Feb 23 01:36:01 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Feb 23 01:36:01 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 23 01:36:01 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 23 01:36:01 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 23 01:36:01 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 23 01:36:01 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 23 01:36:01 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 23 01:36:01 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 23 01:36:01 localhost kernel: signal: max sigframe size: 1776
Feb 23 01:36:01 localhost kernel: BIOS-provided physical RAM map:
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 23 01:36:01 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Feb 23 01:36:01 localhost kernel: NX (Execute Disable) protection: active
Feb 23 01:36:01 localhost kernel: SMBIOS 2.8 present.
Feb 23 01:36:01 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 23 01:36:01 localhost kernel: Hypervisor detected: KVM
Feb 23 01:36:01 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 23 01:36:01 localhost kernel: kvm-clock: using sched offset of 2904884916 cycles
Feb 23 01:36:01 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 23 01:36:01 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 23 01:36:01 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Feb 23 01:36:01 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 23 01:36:01 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 23 01:36:01 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 23 01:36:01 localhost kernel: Using GB pages for direct mapping
Feb 23 01:36:01 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Feb 23 01:36:01 localhost kernel: ACPI: Early table checksum verification disabled
Feb 23 01:36:01 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 23 01:36:01 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 23 01:36:01 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 23 01:36:01 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 23 01:36:01 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 23 01:36:01 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 23 01:36:01 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 23 01:36:01 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 23 01:36:01 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 23 01:36:01 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 23 01:36:01 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 23 01:36:01 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 23 01:36:01 localhost kernel: No NUMA configuration found
Feb 23 01:36:01 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Feb 23 01:36:01 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Feb 23 01:36:01 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Feb 23 01:36:01 localhost kernel: Zone ranges:
Feb 23 01:36:01 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 23 01:36:01 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 23 01:36:01 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Feb 23 01:36:01 localhost kernel:   Device   empty
Feb 23 01:36:01 localhost kernel: Movable zone start for each node
Feb 23 01:36:01 localhost kernel: Early memory node ranges
Feb 23 01:36:01 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 23 01:36:01 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 23 01:36:01 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Feb 23 01:36:01 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Feb 23 01:36:01 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 23 01:36:01 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 23 01:36:01 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 23 01:36:01 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 23 01:36:01 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 23 01:36:01 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 23 01:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 23 01:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 23 01:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 23 01:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 23 01:36:01 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 23 01:36:01 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 23 01:36:01 localhost kernel: TSC deadline timer available
Feb 23 01:36:01 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 23 01:36:01 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 23 01:36:01 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 23 01:36:01 localhost kernel: Booting paravirtualized kernel on KVM
Feb 23 01:36:01 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 23 01:36:01 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 23 01:36:01 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Feb 23 01:36:01 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 23 01:36:01 localhost kernel: Fallback order for Node 0: 0
Feb 23 01:36:01 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Feb 23 01:36:01 localhost kernel: Policy zone: Normal
Feb 23 01:36:01 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 23 01:36:01 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Feb 23 01:36:01 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 23 01:36:01 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 23 01:36:01 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 23 01:36:01 localhost kernel: software IO TLB: area num 8.
Feb 23 01:36:01 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Feb 23 01:36:01 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Feb 23 01:36:01 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 23 01:36:01 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Feb 23 01:36:01 localhost kernel: ftrace: allocated 176 pages with 3 groups
Feb 23 01:36:01 localhost kernel: Dynamic Preempt: voluntary
Feb 23 01:36:01 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 23 01:36:01 localhost kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 23 01:36:01 localhost kernel: 	Trampoline variant of Tasks RCU enabled.
Feb 23 01:36:01 localhost kernel: 	Rude variant of Tasks RCU enabled.
Feb 23 01:36:01 localhost kernel: 	Tracing variant of Tasks RCU enabled.
Feb 23 01:36:01 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 23 01:36:01 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 23 01:36:01 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 23 01:36:01 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 23 01:36:01 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 23 01:36:01 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Feb 23 01:36:01 localhost kernel: Console: colour VGA+ 80x25
Feb 23 01:36:01 localhost kernel: printk: console [tty0] enabled
Feb 23 01:36:01 localhost kernel: printk: console [ttyS0] enabled
Feb 23 01:36:01 localhost kernel: ACPI: Core revision 20211217
Feb 23 01:36:01 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 23 01:36:01 localhost kernel: x2apic enabled
Feb 23 01:36:01 localhost kernel: Switched APIC routing to physical x2apic.
Feb 23 01:36:01 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 23 01:36:01 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 23 01:36:01 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 23 01:36:01 localhost kernel: LSM: Security Framework initializing
Feb 23 01:36:01 localhost kernel: Yama: becoming mindful.
Feb 23 01:36:01 localhost kernel: SELinux: Initializing.
Feb 23 01:36:01 localhost kernel: LSM support for eBPF active
Feb 23 01:36:01 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 23 01:36:01 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 23 01:36:01 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 23 01:36:01 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 23 01:36:01 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 23 01:36:01 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 23 01:36:01 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 23 01:36:01 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 23 01:36:01 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 23 01:36:01 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 23 01:36:01 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 23 01:36:01 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 23 01:36:01 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 23 01:36:01 localhost kernel: Freeing SMP alternatives memory: 36K
Feb 23 01:36:01 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 23 01:36:01 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Feb 23 01:36:01 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 23 01:36:01 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 23 01:36:01 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 23 01:36:01 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 23 01:36:01 localhost kernel: ... version:                0
Feb 23 01:36:01 localhost kernel: ... bit width:              48
Feb 23 01:36:01 localhost kernel: ... generic registers:      6
Feb 23 01:36:01 localhost kernel: ... value mask:             0000ffffffffffff
Feb 23 01:36:01 localhost kernel: ... max period:             00007fffffffffff
Feb 23 01:36:01 localhost kernel: ... fixed-purpose events:   0
Feb 23 01:36:01 localhost kernel: ... event mask:             000000000000003f
Feb 23 01:36:01 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 23 01:36:01 localhost kernel: rcu: 	Max phase no-delay instances is 400.
Feb 23 01:36:01 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 23 01:36:01 localhost kernel: x86: Booting SMP configuration:
Feb 23 01:36:01 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Feb 23 01:36:01 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 23 01:36:01 localhost kernel: smpboot: Max logical packages: 8
Feb 23 01:36:01 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 23 01:36:01 localhost kernel: node 0 deferred pages initialised in 23ms
Feb 23 01:36:01 localhost kernel: devtmpfs: initialized
Feb 23 01:36:01 localhost kernel: x86/mm: Memory block size: 128MB
Feb 23 01:36:01 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 23 01:36:01 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Feb 23 01:36:01 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 23 01:36:01 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 23 01:36:01 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Feb 23 01:36:01 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 23 01:36:01 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 23 01:36:01 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 23 01:36:01 localhost kernel: audit: type=2000 audit(1771828560.287:1): state=initialized audit_enabled=0 res=1
Feb 23 01:36:01 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 23 01:36:01 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 23 01:36:01 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 23 01:36:01 localhost kernel: cpuidle: using governor menu
Feb 23 01:36:01 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Feb 23 01:36:01 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 23 01:36:01 localhost kernel: PCI: Using configuration type 1 for base access
Feb 23 01:36:01 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 23 01:36:01 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 23 01:36:01 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Feb 23 01:36:01 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 23 01:36:01 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 23 01:36:01 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 23 01:36:01 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 23 01:36:01 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 23 01:36:01 localhost kernel: ACPI: Interpreter enabled
Feb 23 01:36:01 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 23 01:36:01 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 23 01:36:01 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 23 01:36:01 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 23 01:36:01 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 23 01:36:01 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 23 01:36:01 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [3] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [4] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [5] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [6] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [7] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [8] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [9] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [10] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [11] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [12] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [13] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [14] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [15] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [16] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [17] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [18] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [19] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [20] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [21] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [22] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [23] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [24] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [25] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [26] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [27] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [28] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [29] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [30] registered
Feb 23 01:36:01 localhost kernel: acpiphp: Slot [31] registered
Feb 23 01:36:01 localhost kernel: PCI host bridge to bus 0000:00
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 23 01:36:01 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 23 01:36:01 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 23 01:36:01 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Feb 23 01:36:01 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Feb 23 01:36:01 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Feb 23 01:36:01 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Feb 23 01:36:01 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Feb 23 01:36:01 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Feb 23 01:36:01 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Feb 23 01:36:01 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 23 01:36:01 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Feb 23 01:36:01 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Feb 23 01:36:01 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 23 01:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 23 01:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 23 01:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 23 01:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 23 01:36:01 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 23 01:36:01 localhost kernel: iommu: Default domain type: Translated
Feb 23 01:36:01 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 23 01:36:01 localhost kernel: SCSI subsystem initialized
Feb 23 01:36:01 localhost kernel: ACPI: bus type USB registered
Feb 23 01:36:01 localhost kernel: usbcore: registered new interface driver usbfs
Feb 23 01:36:01 localhost kernel: usbcore: registered new interface driver hub
Feb 23 01:36:01 localhost kernel: usbcore: registered new device driver usb
Feb 23 01:36:01 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 23 01:36:01 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Feb 23 01:36:01 localhost kernel: PTP clock support registered
Feb 23 01:36:01 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 23 01:36:01 localhost kernel: NetLabel: Initializing
Feb 23 01:36:01 localhost kernel: NetLabel: domain hash size = 128
Feb 23 01:36:01 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Feb 23 01:36:01 localhost kernel: NetLabel: unlabeled traffic allowed by default
Feb 23 01:36:01 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 23 01:36:01 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 23 01:36:01 localhost kernel: vgaarb: loaded
Feb 23 01:36:01 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 23 01:36:01 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 23 01:36:01 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 23 01:36:01 localhost kernel: pnp: PnP ACPI init
Feb 23 01:36:01 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 23 01:36:01 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 23 01:36:01 localhost kernel: NET: Registered PF_INET protocol family
Feb 23 01:36:01 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 23 01:36:01 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Feb 23 01:36:01 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 23 01:36:01 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 23 01:36:01 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 23 01:36:01 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Feb 23 01:36:01 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Feb 23 01:36:01 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 23 01:36:01 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 23 01:36:01 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 23 01:36:01 localhost kernel: NET: Registered PF_XDP protocol family
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 23 01:36:01 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 23 01:36:01 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 23 01:36:01 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 23 01:36:01 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 34339 usecs
Feb 23 01:36:01 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 23 01:36:01 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 23 01:36:01 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 23 01:36:01 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 23 01:36:01 localhost kernel: ACPI: bus type thunderbolt registered
Feb 23 01:36:01 localhost kernel: Initialise system trusted keyrings
Feb 23 01:36:01 localhost kernel: Key type blacklist registered
Feb 23 01:36:01 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Feb 23 01:36:01 localhost kernel: zbud: loaded
Feb 23 01:36:01 localhost kernel: integrity: Platform Keyring initialized
Feb 23 01:36:01 localhost kernel: NET: Registered PF_ALG protocol family
Feb 23 01:36:01 localhost kernel: xor: automatically using best checksumming function   avx
Feb 23 01:36:01 localhost kernel: Key type asymmetric registered
Feb 23 01:36:01 localhost kernel: Asymmetric key parser 'x509' registered
Feb 23 01:36:01 localhost kernel: Running certificate verification selftests
Feb 23 01:36:01 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 23 01:36:01 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 23 01:36:01 localhost kernel: io scheduler mq-deadline registered
Feb 23 01:36:01 localhost kernel: io scheduler kyber registered
Feb 23 01:36:01 localhost kernel: io scheduler bfq registered
Feb 23 01:36:01 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 23 01:36:01 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 23 01:36:01 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 23 01:36:01 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 23 01:36:01 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 23 01:36:01 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 23 01:36:01 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 23 01:36:01 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 23 01:36:01 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 23 01:36:01 localhost kernel: Non-volatile memory driver v1.3
Feb 23 01:36:01 localhost kernel: rdac: device handler registered
Feb 23 01:36:01 localhost kernel: hp_sw: device handler registered
Feb 23 01:36:01 localhost kernel: emc: device handler registered
Feb 23 01:36:01 localhost kernel: alua: device handler registered
Feb 23 01:36:01 localhost kernel: libphy: Fixed MDIO Bus: probed
Feb 23 01:36:01 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Feb 23 01:36:01 localhost kernel: ehci-pci: EHCI PCI platform driver
Feb 23 01:36:01 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Feb 23 01:36:01 localhost kernel: ohci-pci: OHCI PCI platform driver
Feb 23 01:36:01 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Feb 23 01:36:01 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 23 01:36:01 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 23 01:36:01 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 23 01:36:01 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 23 01:36:01 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 23 01:36:01 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 23 01:36:01 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 23 01:36:01 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Feb 23 01:36:01 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 23 01:36:01 localhost kernel: hub 1-0:1.0: USB hub found
Feb 23 01:36:01 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 23 01:36:01 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 23 01:36:01 localhost kernel: usbserial: USB Serial support registered for generic
Feb 23 01:36:01 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 23 01:36:01 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 23 01:36:01 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 23 01:36:01 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 23 01:36:01 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 23 01:36:01 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 23 01:36:01 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 23 01:36:01 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-23T06:36:00 UTC (1771828560)
Feb 23 01:36:01 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 23 01:36:01 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 23 01:36:01 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 23 01:36:01 localhost kernel: usbcore: registered new interface driver usbhid
Feb 23 01:36:01 localhost kernel: usbhid: USB HID core driver
Feb 23 01:36:01 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 23 01:36:01 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 23 01:36:01 localhost kernel: Initializing XFRM netlink socket
Feb 23 01:36:01 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 23 01:36:01 localhost kernel: Segment Routing with IPv6
Feb 23 01:36:01 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 23 01:36:01 localhost kernel: mpls_gso: MPLS GSO support
Feb 23 01:36:01 localhost kernel: IPI shorthand broadcast: enabled
Feb 23 01:36:01 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 23 01:36:01 localhost kernel: AES CTR mode by8 optimization enabled
Feb 23 01:36:01 localhost kernel: sched_clock: Marking stable (728017336, 177516314)->(1032399249, -126865599)
Feb 23 01:36:01 localhost kernel: registered taskstats version 1
Feb 23 01:36:01 localhost kernel: Loading compiled-in X.509 certificates
Feb 23 01:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 23 01:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 23 01:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 23 01:36:01 localhost kernel: zswap: loaded using pool lzo/zbud
Feb 23 01:36:01 localhost kernel: page_owner is disabled
Feb 23 01:36:01 localhost kernel: Key type big_key registered
Feb 23 01:36:01 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 23 01:36:01 localhost kernel: Freeing initrd memory: 74232K
Feb 23 01:36:01 localhost kernel: Key type encrypted registered
Feb 23 01:36:01 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 23 01:36:01 localhost kernel: Loading compiled-in module X.509 certificates
Feb 23 01:36:01 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 23 01:36:01 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 23 01:36:01 localhost kernel: ima: No architecture policies found
Feb 23 01:36:01 localhost kernel: evm: Initialising EVM extended attributes:
Feb 23 01:36:01 localhost kernel: evm: security.selinux
Feb 23 01:36:01 localhost kernel: evm: security.SMACK64 (disabled)
Feb 23 01:36:01 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 23 01:36:01 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 23 01:36:01 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 23 01:36:01 localhost kernel: evm: security.apparmor (disabled)
Feb 23 01:36:01 localhost kernel: evm: security.ima
Feb 23 01:36:01 localhost kernel: evm: security.capability
Feb 23 01:36:01 localhost kernel: evm: HMAC attrs: 0x1
Feb 23 01:36:01 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 23 01:36:01 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 23 01:36:01 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 23 01:36:01 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 23 01:36:01 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 23 01:36:01 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 23 01:36:01 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 23 01:36:01 localhost kernel: Freeing unused decrypted memory: 2036K
Feb 23 01:36:01 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Feb 23 01:36:01 localhost kernel: Write protecting the kernel read-only data: 26624k
Feb 23 01:36:01 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 23 01:36:01 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Feb 23 01:36:01 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 23 01:36:01 localhost kernel: Run /init as init process
Feb 23 01:36:01 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 01:36:01 localhost systemd[1]: Detected virtualization kvm.
Feb 23 01:36:01 localhost systemd[1]: Detected architecture x86-64.
Feb 23 01:36:01 localhost systemd[1]: Running in initrd.
Feb 23 01:36:01 localhost systemd[1]: No hostname configured, using default hostname.
Feb 23 01:36:01 localhost systemd[1]: Hostname set to .
Feb 23 01:36:01 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 23 01:36:01 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 23 01:36:01 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 23 01:36:01 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 23 01:36:01 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 23 01:36:01 localhost systemd[1]: Reached target Local File Systems.
Feb 23 01:36:01 localhost systemd[1]: Reached target Path Units.
Feb 23 01:36:01 localhost systemd[1]: Reached target Slice Units.
Feb 23 01:36:01 localhost systemd[1]: Reached target Swaps.
Feb 23 01:36:01 localhost systemd[1]: Reached target Timer Units.
Feb 23 01:36:01 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 23 01:36:01 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 23 01:36:01 localhost systemd[1]: Listening on Journal Socket.
Feb 23 01:36:01 localhost systemd[1]: Listening on udev Control Socket.
Feb 23 01:36:01 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 23 01:36:01 localhost systemd[1]: Reached target Socket Units.
Feb 23 01:36:01 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 23 01:36:01 localhost systemd[1]: Starting Journal Service...
Feb 23 01:36:01 localhost systemd[1]: Starting Load Kernel Modules...
Feb 23 01:36:01 localhost systemd[1]: Starting Create System Users...
Feb 23 01:36:01 localhost systemd[1]: Starting Setup Virtual Console...
Feb 23 01:36:01 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 23 01:36:01 localhost systemd-journald[283]: Journal started
Feb 23 01:36:01 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/bdcaa433cfc7450a99abf0985ab59447) is 8.0M, max 314.7M, 306.7M free.
Feb 23 01:36:01 localhost systemd-modules-load[284]: Module 'msr' is built in
Feb 23 01:36:01 localhost systemd[1]: Started Journal Service.
Feb 23 01:36:01 localhost systemd[1]: Finished Load Kernel Modules.
Feb 23 01:36:01 localhost systemd[1]: Finished Setup Virtual Console.
Feb 23 01:36:01 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 23 01:36:01 localhost systemd[1]: Starting dracut cmdline hook...
Feb 23 01:36:01 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 01:36:01 localhost systemd-sysusers[285]: Creating group 'sgx' with GID 997.
Feb 23 01:36:01 localhost systemd-sysusers[285]: Creating group 'users' with GID 100.
Feb 23 01:36:01 localhost systemd-sysusers[285]: Creating group 'dbus' with GID 81.
Feb 23 01:36:01 localhost systemd-sysusers[285]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 23 01:36:01 localhost systemd[1]: Finished Create System Users.
Feb 23 01:36:01 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 23 01:36:01 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 23 01:36:01 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 23 01:36:01 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 23 01:36:01 localhost dracut-cmdline[288]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Feb 23 01:36:01 localhost dracut-cmdline[288]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 23 01:36:01 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 23 01:36:01 localhost systemd[1]: Finished dracut cmdline hook.
Feb 23 01:36:01 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 23 01:36:01 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 23 01:36:01 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 23 01:36:01 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Feb 23 01:36:01 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 23 01:36:01 localhost kernel: RPC: Registered udp transport module.
Feb 23 01:36:01 localhost kernel: RPC: Registered tcp transport module.
Feb 23 01:36:01 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 23 01:36:01 localhost rpc.statd[406]: Version 2.5.4 starting
Feb 23 01:36:01 localhost rpc.statd[406]: Initializing NSM state
Feb 23 01:36:01 localhost rpc.idmapd[411]: Setting log level to 0
Feb 23 01:36:01 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 23 01:36:01 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 01:36:01 localhost systemd-udevd[424]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 01:36:01 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 01:36:01 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 23 01:36:01 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 23 01:36:01 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 23 01:36:01 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 23 01:36:01 localhost systemd[1]: Reached target System Initialization.
Feb 23 01:36:01 localhost systemd[1]: Reached target Basic System.
Feb 23 01:36:01 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 23 01:36:01 localhost systemd[1]: Reached target Network.
Feb 23 01:36:01 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 23 01:36:01 localhost systemd[1]: Starting dracut initqueue hook...
Feb 23 01:36:01 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Feb 23 01:36:01 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 23 01:36:01 localhost kernel: GPT:20971519 != 838860799
Feb 23 01:36:01 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 23 01:36:01 localhost kernel: GPT:20971519 != 838860799
Feb 23 01:36:01 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 23 01:36:01 localhost kernel: vda: vda1 vda2 vda3 vda4
Feb 23 01:36:01 localhost kernel: scsi host0: ata_piix
Feb 23 01:36:01 localhost kernel: scsi host1: ata_piix
Feb 23 01:36:01 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Feb 23 01:36:01 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Feb 23 01:36:02 localhost systemd-udevd[426]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 01:36:02 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 23 01:36:02 localhost systemd[1]: Reached target Initrd Root Device.
Feb 23 01:36:02 localhost kernel: ata1: found unknown device (class 0)
Feb 23 01:36:02 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 23 01:36:02 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Feb 23 01:36:02 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 23 01:36:02 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 23 01:36:02 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 23 01:36:02 localhost systemd[1]: Finished dracut initqueue hook.
Feb 23 01:36:02 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 23 01:36:02 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 23 01:36:02 localhost systemd[1]: Reached target Remote File Systems.
Feb 23 01:36:02 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 23 01:36:02 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 23 01:36:02 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Feb 23 01:36:02 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Feb 23 01:36:02 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 23 01:36:02 localhost systemd[1]: Mounting /sysroot...
Feb 23 01:36:02 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 23 01:36:02 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Feb 23 01:36:02 localhost kernel: XFS (vda4): Ending clean mount
Feb 23 01:36:02 localhost systemd[1]: Mounted /sysroot.
Feb 23 01:36:02 localhost systemd[1]: Reached target Initrd Root File System.
Feb 23 01:36:02 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 23 01:36:02 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 23 01:36:02 localhost systemd[1]: Reached target Initrd File Systems.
Feb 23 01:36:02 localhost systemd[1]: Reached target Initrd Default Target.
Feb 23 01:36:02 localhost systemd[1]: Starting dracut mount hook...
Feb 23 01:36:02 localhost systemd[1]: Finished dracut mount hook.
Feb 23 01:36:02 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 23 01:36:02 localhost rpc.idmapd[411]: exiting on signal 15
Feb 23 01:36:02 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 23 01:36:02 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 23 01:36:02 localhost systemd[1]: Stopped target Network.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Timer Units.
Feb 23 01:36:02 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 23 01:36:02 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Basic System.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Path Units.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Remote File Systems.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Slice Units.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Socket Units.
Feb 23 01:36:02 localhost systemd[1]: Stopped target System Initialization.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Local File Systems.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Swaps.
Feb 23 01:36:02 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut mount hook.
Feb 23 01:36:02 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 23 01:36:02 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 23 01:36:02 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 23 01:36:02 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 23 01:36:02 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 23 01:36:02 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 23 01:36:02 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 23 01:36:02 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 23 01:36:02 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 23 01:36:02 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 23 01:36:02 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 23 01:36:02 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 23 01:36:02 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Closed udev Control Socket.
Feb 23 01:36:02 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Closed udev Kernel Socket.
Feb 23 01:36:02 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 23 01:36:02 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 23 01:36:02 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 23 01:36:02 localhost systemd[1]: Starting Cleanup udev Database...
Feb 23 01:36:03 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 23 01:36:03 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 23 01:36:03 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Stopped Create System Users.
Feb 23 01:36:03 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 23 01:36:03 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Finished Cleanup udev Database.
Feb 23 01:36:03 localhost systemd[1]: Reached target Switch Root.
Feb 23 01:36:03 localhost systemd[1]: Starting Switch Root...
Feb 23 01:36:03 localhost systemd[1]: Switching root.
Feb 23 01:36:03 localhost systemd-journald[283]: Journal stopped
Feb 23 01:36:03 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Feb 23 01:36:03 localhost kernel: audit: type=1404 audit(1771828563.167:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 23 01:36:03 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 01:36:03 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 01:36:03 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 01:36:03 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 01:36:03 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 01:36:03 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 01:36:03 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 01:36:03 localhost kernel: audit: type=1403 audit(1771828563.299:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 23 01:36:03 localhost systemd[1]: Successfully loaded SELinux policy in 137.357ms.
Feb 23 01:36:03 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 33.289ms.
Feb 23 01:36:03 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 01:36:03 localhost systemd[1]: Detected virtualization kvm.
Feb 23 01:36:03 localhost systemd[1]: Detected architecture x86-64.
Feb 23 01:36:03 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 01:36:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 01:36:03 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Stopped Switch Root.
Feb 23 01:36:03 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 23 01:36:03 localhost systemd[1]: Created slice Slice /system/getty.
Feb 23 01:36:03 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 23 01:36:03 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 23 01:36:03 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 23 01:36:03 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Feb 23 01:36:03 localhost systemd[1]: Created slice User and Session Slice.
Feb 23 01:36:03 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 23 01:36:03 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 23 01:36:03 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 23 01:36:03 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 23 01:36:03 localhost systemd[1]: Stopped target Switch Root.
Feb 23 01:36:03 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 23 01:36:03 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 23 01:36:03 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 23 01:36:03 localhost systemd[1]: Reached target Path Units.
Feb 23 01:36:03 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 23 01:36:03 localhost systemd[1]: Reached target Slice Units.
Feb 23 01:36:03 localhost systemd[1]: Reached target Swaps.
Feb 23 01:36:03 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 23 01:36:03 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 23 01:36:03 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 23 01:36:03 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 23 01:36:03 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 23 01:36:03 localhost systemd[1]: Listening on udev Control Socket.
Feb 23 01:36:03 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 23 01:36:03 localhost systemd[1]: Mounting Huge Pages File System...
Feb 23 01:36:03 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 23 01:36:03 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 23 01:36:03 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 23 01:36:03 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 23 01:36:03 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 23 01:36:03 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 01:36:03 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 23 01:36:03 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 23 01:36:03 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 23 01:36:03 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 23 01:36:03 localhost systemd[1]: Stopped Journal Service.
Feb 23 01:36:03 localhost kernel: fuse: init (API version 7.36)
Feb 23 01:36:03 localhost systemd[1]: Starting Journal Service...
Feb 23 01:36:03 localhost systemd[1]: Starting Load Kernel Modules...
Feb 23 01:36:03 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 23 01:36:03 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 23 01:36:03 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 23 01:36:03 localhost systemd-journald[618]: Journal started
Feb 23 01:36:03 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/c0212a8b024a111cfc61293864f36c87) is 8.0M, max 314.7M, 306.7M free.
Feb 23 01:36:03 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 23 01:36:03 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 23 01:36:03 localhost systemd-modules-load[619]: Module 'msr' is built in
Feb 23 01:36:04 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 23 01:36:04 localhost systemd[1]: Started Journal Service.
Feb 23 01:36:04 localhost systemd[1]: Mounted Huge Pages File System.
Feb 23 01:36:04 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 23 01:36:04 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 23 01:36:04 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 23 01:36:04 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 23 01:36:04 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 23 01:36:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 01:36:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 01:36:04 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 23 01:36:04 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 23 01:36:04 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 23 01:36:04 localhost kernel: ACPI: bus type drm_connector registered
Feb 23 01:36:04 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 23 01:36:04 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 23 01:36:04 localhost systemd[1]: Finished Load Kernel Modules.
Feb 23 01:36:04 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 23 01:36:04 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 23 01:36:04 localhost systemd[1]: Mounting FUSE Control File System...
Feb 23 01:36:04 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 23 01:36:04 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 23 01:36:04 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 23 01:36:04 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 23 01:36:04 localhost systemd[1]: Starting Load/Save Random Seed...
Feb 23 01:36:04 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 01:36:04 localhost systemd[1]: Starting Create System Users...
Feb 23 01:36:04 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/c0212a8b024a111cfc61293864f36c87) is 8.0M, max 314.7M, 306.7M free.
Feb 23 01:36:04 localhost systemd-journald[618]: Received client request to flush runtime journal.
Feb 23 01:36:04 localhost systemd[1]: Mounted FUSE Control File System.
Feb 23 01:36:04 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 23 01:36:04 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 23 01:36:04 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 23 01:36:04 localhost systemd[1]: Finished Load/Save Random Seed.
Feb 23 01:36:04 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 23 01:36:04 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 23 01:36:04 localhost systemd-sysusers[631]: Creating group 'sgx' with GID 989.
Feb 23 01:36:04 localhost systemd-sysusers[631]: Creating group 'systemd-oom' with GID 988.
Feb 23 01:36:04 localhost systemd-sysusers[631]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Feb 23 01:36:04 localhost systemd[1]: Finished Create System Users.
Feb 23 01:36:04 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 23 01:36:04 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 23 01:36:04 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 23 01:36:04 localhost systemd[1]: Set up automount EFI System Partition Automount.
Feb 23 01:36:04 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 23 01:36:04 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 01:36:04 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 01:36:04 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 01:36:04 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 23 01:36:04 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 23 01:36:04 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 23 01:36:04 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 23 01:36:04 localhost systemd-udevd[639]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 01:36:04 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Feb 23 01:36:04 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Feb 23 01:36:04 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Feb 23 01:36:04 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 23 01:36:04 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 23 01:36:04 localhost systemd-fsck[681]: fsck.fat 4.2 (2021-01-31)
Feb 23 01:36:04 localhost systemd-fsck[681]: /dev/vda2: 12 files, 1782/51145 clusters
Feb 23 01:36:04 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Feb 23 01:36:04 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 23 01:36:04 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 23 01:36:04 localhost kernel: Console: switching to colour dummy device 80x25
Feb 23 01:36:04 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 23 01:36:04 localhost kernel: [drm] features: -context_init
Feb 23 01:36:04 localhost kernel: [drm] number of scanouts: 1
Feb 23 01:36:04 localhost kernel: [drm] number of cap sets: 0
Feb 23 01:36:04 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Feb 23 01:36:04 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Feb 23 01:36:04 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 23 01:36:04 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 23 01:36:04 localhost kernel: SVM: TSC scaling supported
Feb 23 01:36:04 localhost kernel: kvm: Nested Virtualization enabled
Feb 23 01:36:04 localhost kernel: SVM: kvm: Nested Paging enabled
Feb 23 01:36:04 localhost kernel: SVM: LBR virtualization supported
Feb 23 01:36:05 localhost systemd[1]: Mounting /boot...
Feb 23 01:36:05 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Feb 23 01:36:05 localhost kernel: XFS (vda3): Ending clean mount
Feb 23 01:36:05 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Feb 23 01:36:05 localhost systemd[1]: Mounted /boot.
Feb 23 01:36:05 localhost systemd[1]: Mounting /boot/efi...
Feb 23 01:36:05 localhost systemd[1]: Mounted /boot/efi.
Feb 23 01:36:05 localhost systemd[1]: Reached target Local File Systems.
Feb 23 01:36:05 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 23 01:36:05 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 23 01:36:05 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 23 01:36:05 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 01:36:05 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 23 01:36:05 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 23 01:36:05 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 23 01:36:05 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 716 (bootctl)
Feb 23 01:36:05 localhost systemd[1]: Starting File System Check on /dev/vda2...
Feb 23 01:36:05 localhost systemd[1]: Finished File System Check on /dev/vda2.
Feb 23 01:36:05 localhost systemd[1]: Mounting EFI System Partition Automount...
Feb 23 01:36:05 localhost systemd[1]: Mounted EFI System Partition Automount.
Feb 23 01:36:05 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 23 01:36:05 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 23 01:36:05 localhost systemd[1]: Starting Security Auditing Service...
Feb 23 01:36:05 localhost systemd[1]: Starting RPC Bind...
Feb 23 01:36:05 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 23 01:36:05 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Feb 23 01:36:05 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Feb 23 01:36:05 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 23 01:36:05 localhost systemd[1]: Started RPC Bind.
Feb 23 01:36:05 localhost augenrules[730]: /sbin/augenrules: No change
Feb 23 01:36:05 localhost augenrules[740]: No rules
Feb 23 01:36:05 localhost augenrules[740]: enabled 1
Feb 23 01:36:05 localhost augenrules[740]: failure 1
Feb 23 01:36:05 localhost augenrules[740]: pid 725
Feb 23 01:36:05 localhost augenrules[740]: rate_limit 0
Feb 23 01:36:05 localhost augenrules[740]: backlog_limit 8192
Feb 23 01:36:05 localhost augenrules[740]: lost 0
Feb 23 01:36:05 localhost augenrules[740]: backlog 0
Feb 23 01:36:05 localhost augenrules[740]: backlog_wait_time 60000
Feb 23 01:36:05 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 23 01:36:05 localhost augenrules[740]: enabled 1
Feb 23 01:36:05 localhost augenrules[740]: failure 1
Feb 23 01:36:05 localhost augenrules[740]: pid 725
Feb 23 01:36:05 localhost augenrules[740]: rate_limit 0
Feb 23 01:36:05 localhost augenrules[740]: backlog_limit 8192
Feb 23 01:36:05 localhost augenrules[740]: lost 0
Feb 23 01:36:05 localhost augenrules[740]: backlog 3
Feb 23 01:36:05 localhost augenrules[740]: backlog_wait_time 60000
Feb 23 01:36:05 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 23 01:36:05 localhost augenrules[740]: enabled 1
Feb 23 01:36:05 localhost augenrules[740]: failure 1
Feb 23 01:36:05 localhost augenrules[740]: pid 725
Feb 23 01:36:05 localhost augenrules[740]: rate_limit 0
Feb 23 01:36:05 localhost augenrules[740]: backlog_limit 8192
Feb 23 01:36:05 localhost augenrules[740]: lost 0
Feb 23 01:36:05 localhost augenrules[740]: backlog 0
Feb 23 01:36:05 localhost augenrules[740]: backlog_wait_time 60000
Feb 23 01:36:05 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 23 01:36:05 localhost systemd[1]: Started Security Auditing Service.
Feb 23 01:36:05 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 23 01:36:05 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 23 01:36:05 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 23 01:36:05 localhost systemd[1]: Starting Update is Completed...
Feb 23 01:36:05 localhost systemd[1]: Finished Update is Completed.
Feb 23 01:36:05 localhost systemd[1]: Reached target System Initialization.
Feb 23 01:36:05 localhost systemd[1]: Started dnf makecache --timer.
Feb 23 01:36:05 localhost systemd[1]: Started Daily rotation of log files.
Feb 23 01:36:05 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 23 01:36:05 localhost systemd[1]: Reached target Timer Units.
Feb 23 01:36:05 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 23 01:36:05 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 23 01:36:05 localhost systemd[1]: Reached target Socket Units.
Feb 23 01:36:05 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Feb 23 01:36:05 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 23 01:36:05 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 01:36:05 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 23 01:36:05 localhost systemd[1]: Reached target Basic System.
Feb 23 01:36:05 localhost journal[750]: Ready
Feb 23 01:36:05 localhost systemd[1]: Starting NTP client/server...
Feb 23 01:36:05 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 23 01:36:05 localhost systemd[1]: Started irqbalance daemon.
Feb 23 01:36:05 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 23 01:36:05 localhost systemd[1]: Starting System Logging Service...
Feb 23 01:36:05 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 01:36:05 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 01:36:05 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 01:36:05 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 23 01:36:05 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 23 01:36:05 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 23 01:36:05 localhost systemd[1]: Starting User Login Management...
Feb 23 01:36:05 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 23 01:36:05 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start
Feb 23 01:36:05 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Feb 23 01:36:05 localhost systemd[1]: Started System Logging Service.
Feb 23 01:36:05 localhost systemd-logind[759]: New seat seat0.
Feb 23 01:36:05 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 23 01:36:05 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 23 01:36:05 localhost systemd[1]: Started User Login Management.
Feb 23 01:36:05 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 01:36:05 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data
Feb 23 01:36:05 localhost chronyd[765]: Loaded seccomp filter (level 2)
Feb 23 01:36:05 localhost systemd[1]: Started NTP client/server.
Feb 23 01:36:05 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 01:36:06 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Mon, 23 Feb 2026 06:36:06 +0000. Up 6.32 seconds.
Feb 23 01:36:06 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp4o17gt98.mount: Deactivated successfully.
Feb 23 01:36:06 localhost systemd[1]: Starting Hostname Service...
Feb 23 01:36:06 localhost systemd[1]: Started Hostname Service.
Feb 23 01:36:06 localhost systemd-hostnamed[783]: Hostname set to (static)
Feb 23 01:36:06 localhost systemd[1]: Finished Initial cloud-init job (pre-networking).
Feb 23 01:36:06 localhost systemd[1]: Reached target Preparation for Network.
Feb 23 01:36:06 localhost systemd[1]: Starting Network Manager...
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7061] NetworkManager (version 1.42.2-1.el9) is starting... (boot:7e1679c6-ea6b-4cb0-813d-ca6f65e53cae)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7066] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 23 01:36:06 localhost systemd[1]: Started Network Manager.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7113] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 01:36:06 localhost systemd[1]: Reached target Network.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7205] manager[0x55ed99fff020]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 01:36:06 localhost systemd[1]: Starting Network Manager Wait Online...
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7241] hostname: hostname: using hostnamed
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7241] hostname: static hostname changed from (none) to "np0005626463.novalocal"
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7253] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 01:36:06 localhost systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 23 01:36:06 localhost systemd[1]: Starting Enable periodic update of entitlement certificates....
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7384] manager[0x55ed99fff020]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7385] manager[0x55ed99fff020]: rfkill: WWAN hardware radio set enabled
Feb 23 01:36:06 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 23 01:36:06 localhost systemd[1]: Started GSSAPI Proxy Daemon.
Feb 23 01:36:06 localhost systemd[1]: Started Enable periodic update of entitlement certificates..
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7494] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7495] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7506] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7507] manager: Networking is enabled by state file
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7558] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7559] settings: Loaded settings plugin: keyfile (internal)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7585] dhcp: init: Using DHCP client 'internal'
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7587] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7596] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 01:36:06 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7602] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7607] device (lo): Activation: starting connection 'lo' (8bdfeccc-b3ac-4c33-8351-8677ac367e4c)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7614] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7616] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7650] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7654] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7655] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7657] device (eth0): carrier: link connected
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7660] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7666] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 01:36:06 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 23 01:36:06 localhost systemd[1]: Reached target NFS client services.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7698] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7702] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7703] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7706] manager: NetworkManager state is now CONNECTING
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7708] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7718] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7722] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 01:36:06 localhost systemd[1]: Reached target Remote File Systems.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7786] dhcp4 (eth0): state changed new lease, address=38.102.83.164
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7790] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7812] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 23 01:36:06 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7988] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7991] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.7995] device (lo): Activation: successful, device activated.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.8000] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.8002] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.8004] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.8006] device (eth0): Activation: successful, device activated.
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.8010] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 01:36:06 localhost NetworkManager[788]: [1771828566.8012] manager: startup complete
Feb 23 01:36:06 localhost systemd[1]: Finished Network Manager Wait Online.
Feb 23 01:36:06 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Feb 23 01:36:07 localhost cloud-init[951]: Cloud-init v. 22.1-9.el9 running 'init' at Mon, 23 Feb 2026 06:36:07 +0000. Up 7.20 seconds.
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | Device | Up | Address | Mask | Scope | Hw-Address |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | eth0 | True | 38.102.83.164 | 255.255.255.0 | global | fa:16:3e:9a:b6:c6 |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | eth0 | True | fe80::f816:3eff:fe9a:b6c6/64 | . | link | fa:16:3e:9a:b6:c6 |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | lo | True | 127.0.0.1 | 255.0.0.0 | host | . |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | lo | True | ::1/128 | . | host | . |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | Route | Destination | Gateway | Genmask | Interface | Flags |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | 0 | 0.0.0.0 | 38.102.83.1 | 0.0.0.0 | eth0 | UG |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | 1 | 38.102.83.0 | 0.0.0.0 | 255.255.255.0 | eth0 | U |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | 2 | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 | eth0 | UGH |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | 1 | fe80::/64 | :: | eth0 | U |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: | 3 | multicast | :: | eth0 | U |
Feb 23 01:36:07 localhost cloud-init[951]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 23 01:36:07 localhost systemd[1]: Starting Authorization Manager...
Feb 23 01:36:07 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Feb 23 01:36:07 localhost polkitd[1033]: Started polkitd version 0.117
Feb 23 01:36:07 localhost systemd[1]: Started Authorization Manager.
Feb 23 01:36:10 localhost cloud-init[951]: Generating public/private rsa key pair.
Feb 23 01:36:10 localhost cloud-init[951]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 23 01:36:10 localhost cloud-init[951]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 23 01:36:10 localhost cloud-init[951]: The key fingerprint is:
Feb 23 01:36:10 localhost cloud-init[951]: SHA256:pdNZ75QGpUNpcq8gaslC+2P1B6YHH1UKWMec5bQ/qws root@np0005626463.novalocal
Feb 23 01:36:10 localhost cloud-init[951]: The key's randomart image is:
Feb 23 01:36:10 localhost cloud-init[951]: +---[RSA 3072]----+
Feb 23 01:36:10 localhost cloud-init[951]: | o.oo++ |
Feb 23 01:36:10 localhost cloud-init[951]: | . ooB=.. |
Feb 23 01:36:10 localhost cloud-init[951]: | .==+o |
Feb 23 01:36:10 localhost cloud-init[951]: | . .+.oo+.o |
Feb 23 01:36:10 localhost cloud-init[951]: | . o oS.oo .=..|
Feb 23 01:36:10 localhost cloud-init[951]: | o = o.+ .+ o|
Feb 23 01:36:10 localhost cloud-init[951]: | + . * E .. |
Feb 23 01:36:10 localhost cloud-init[951]: | + . + o . |
Feb 23 01:36:10 localhost cloud-init[951]: | . . . . o. |
Feb 23 01:36:10 localhost cloud-init[951]: +----[SHA256]-----+
Feb 23 01:36:10 localhost cloud-init[951]: Generating public/private ecdsa key pair.
Feb 23 01:36:10 localhost cloud-init[951]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 23 01:36:10 localhost cloud-init[951]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 23 01:36:10 localhost cloud-init[951]: The key fingerprint is:
Feb 23 01:36:10 localhost cloud-init[951]: SHA256:OD3aGn6FewPNOJKfrmSp+eMwt9pK7oB2rTS1kgG+Y2g root@np0005626463.novalocal
Feb 23 01:36:10 localhost cloud-init[951]: The key's randomart image is:
Feb 23 01:36:10 localhost cloud-init[951]: +---[ECDSA 256]---+
Feb 23 01:36:10 localhost cloud-init[951]: | |
Feb 23 01:36:10 localhost cloud-init[951]: | |
Feb 23 01:36:10 localhost cloud-init[951]: | . |
Feb 23 01:36:10 localhost cloud-init[951]: |. . o |
Feb 23 01:36:10 localhost cloud-init[951]: | . . .o.S= |
Feb 23 01:36:10 localhost cloud-init[951]: |... = +==.+ |
Feb 23 01:36:10 localhost cloud-init[951]: |oE.*+oBo.* |
Feb 23 01:36:10 localhost cloud-init[951]: |+ +++@.+= o |
Feb 23 01:36:10 localhost cloud-init[951]: | oB=O=.. . |
Feb 23 01:36:10 localhost cloud-init[951]: +----[SHA256]-----+
Feb 23 01:36:10 localhost cloud-init[951]: Generating public/private ed25519 key pair.
Feb 23 01:36:10 localhost cloud-init[951]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 23 01:36:10 localhost cloud-init[951]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 23 01:36:10 localhost cloud-init[951]: The key fingerprint is:
Feb 23 01:36:10 localhost cloud-init[951]: SHA256:1oXKoehgbeTcKm6BaoZ08cczaBK1XI/qMeTUf9gvmKw root@np0005626463.novalocal
Feb 23 01:36:10 localhost cloud-init[951]: The key's randomart image is:
Feb 23 01:36:10 localhost cloud-init[951]: +--[ED25519 256]--+
Feb 23 01:36:10 localhost cloud-init[951]: | |
Feb 23 01:36:10 localhost cloud-init[951]: | . . . |
Feb 23 01:36:10 localhost cloud-init[951]: | + + + . . |
Feb 23 01:36:10 localhost cloud-init[951]: | B B = = . |
Feb 23 01:36:10 localhost cloud-init[951]: | .o & * S + |
Feb 23 01:36:10 localhost cloud-init[951]: |.o.* X * o o |
Feb 23 01:36:10 localhost cloud-init[951]: |+ o.* + + + . |
Feb 23 01:36:10 localhost cloud-init[951]: |o+.. . + . . |
Feb 23 01:36:10 localhost cloud-init[951]: |o.. E. . |
Feb 23 01:36:10 localhost cloud-init[951]: +----[SHA256]-----+
Feb 23 01:36:10 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Feb 23 01:36:10 localhost systemd[1]: Reached target Cloud-config availability.
Feb 23 01:36:10 localhost systemd[1]: Reached target Network is Online.
Feb 23 01:36:10 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Feb 23 01:36:10 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Feb 23 01:36:10 localhost systemd[1]: Starting Crash recovery kernel arming...
Feb 23 01:36:10 localhost systemd[1]: Starting Notify NFS peers of a restart...
Feb 23 01:36:10 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 23 01:36:10 localhost sm-notify[1131]: Version 2.5.4 starting
Feb 23 01:36:10 localhost systemd[1]: Starting Permit User Sessions...
Feb 23 01:36:10 localhost systemd[1]: Started Notify NFS peers of a restart.
Feb 23 01:36:10 localhost systemd[1]: Finished Permit User Sessions.
Feb 23 01:36:10 localhost sshd[1132]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost systemd[1]: Started Command Scheduler.
Feb 23 01:36:10 localhost systemd[1]: Started Getty on tty1.
Feb 23 01:36:10 localhost systemd[1]: Started Serial Getty on ttyS0.
Feb 23 01:36:10 localhost systemd[1]: Reached target Login Prompts.
Feb 23 01:36:10 localhost systemd[1]: Started OpenSSH server daemon.
Feb 23 01:36:10 localhost systemd[1]: Reached target Multi-User System.
Feb 23 01:36:10 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 23 01:36:10 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 23 01:36:10 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 23 01:36:10 localhost sshd[1146]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost sshd[1163]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost sshd[1176]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost kdumpctl[1137]: kdump: No kdump initial ramdisk found.
Feb 23 01:36:10 localhost kdumpctl[1137]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Feb 23 01:36:10 localhost sshd[1193]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost sshd[1201]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost sshd[1220]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost sshd[1247]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost sshd[1265]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost cloud-init[1266]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Mon, 23 Feb 2026 06:36:10 +0000. Up 10.67 seconds.
Feb 23 01:36:10 localhost sshd[1278]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:10 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Feb 23 01:36:10 localhost systemd[1]: Starting Execute cloud user/final scripts...
Feb 23 01:36:10 localhost dracut[1434]: dracut-057-21.git20230214.el9
Feb 23 01:36:10 localhost cloud-init[1452]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Mon, 23 Feb 2026 06:36:10 +0000. Up 11.06 seconds.
Feb 23 01:36:11 localhost cloud-init[1457]: #############################################################
Feb 23 01:36:11 localhost cloud-init[1464]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 23 01:36:11 localhost dracut[1436]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Feb 23 01:36:11 localhost cloud-init[1472]: 256 SHA256:OD3aGn6FewPNOJKfrmSp+eMwt9pK7oB2rTS1kgG+Y2g root@np0005626463.novalocal (ECDSA)
Feb 23 01:36:11 localhost cloud-init[1479]: 256 SHA256:1oXKoehgbeTcKm6BaoZ08cczaBK1XI/qMeTUf9gvmKw root@np0005626463.novalocal (ED25519)
Feb 23 01:36:11 localhost cloud-init[1483]: 3072 SHA256:pdNZ75QGpUNpcq8gaslC+2P1B6YHH1UKWMec5bQ/qws root@np0005626463.novalocal (RSA)
Feb 23 01:36:11 localhost cloud-init[1486]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 23 01:36:11 localhost cloud-init[1490]: #############################################################
Feb 23 01:36:11 localhost cloud-init[1452]: Cloud-init v. 22.1-9.el9 finished at Mon, 23 Feb 2026 06:36:11 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 11.32 seconds
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 23 01:36:11 localhost systemd[1]: Reloading Network Manager...
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 23 01:36:11 localhost NetworkManager[788]: [1771828571.2636] audit: op="reload" arg="0" pid=1592 uid=0 result="success"
Feb 23 01:36:11 localhost NetworkManager[788]: [1771828571.2645] config: signal: SIGHUP (no changes from disk)
Feb 23 01:36:11 localhost systemd[1]: Reloaded Network Manager.
Feb 23 01:36:11 localhost systemd[1]: Finished Execute cloud user/final scripts.
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 23 01:36:11 localhost systemd[1]: Reached target Cloud-init target.
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: memstrack is not available
Feb 23 01:36:11 localhost dracut[1436]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 23 01:36:11 localhost chronyd[765]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Feb 23 01:36:11 localhost chronyd[765]: System clock TAI offset set to 37 seconds
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 23 01:36:11 localhost dracut[1436]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 23 01:36:12 localhost dracut[1436]: memstrack is not available
Feb 23 01:36:12 localhost dracut[1436]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 23 01:36:12 localhost dracut[1436]: *** Including module: systemd ***
Feb 23 01:36:12 localhost dracut[1436]: *** Including module: systemd-initrd ***
Feb 23 01:36:12 localhost dracut[1436]: *** Including module: i18n ***
Feb 23 01:36:12 localhost dracut[1436]: No KEYMAP configured.
Feb 23 01:36:12 localhost dracut[1436]: *** Including module: drm ***
Feb 23 01:36:13 localhost dracut[1436]: *** Including module: prefixdevname ***
Feb 23 01:36:13 localhost dracut[1436]: *** Including module: kernel-modules ***
Feb 23 01:36:13 localhost dracut[1436]: *** Including module: kernel-modules-extra ***
Feb 23 01:36:13 localhost dracut[1436]: *** Including module: qemu ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: fstab-sys ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: rootfs-block ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: terminfo ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: udev-rules ***
Feb 23 01:36:14 localhost dracut[1436]: Skipping udev rule: 91-permissions.rules
Feb 23 01:36:14 localhost dracut[1436]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: virtiofs ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: dracut-systemd ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: usrmount ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: base ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: fs-lib ***
Feb 23 01:36:14 localhost dracut[1436]: *** Including module: kdumpbase ***
Feb 23 01:36:15 localhost dracut[1436]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl module: mangling fw_dir
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 23 01:36:15 localhost dracut[1436]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Feb 23 01:36:15 localhost dracut[1436]: *** Including module: shutdown ***
Feb 23 01:36:15 localhost dracut[1436]: *** Including module: squash ***
Feb 23 01:36:15 localhost dracut[1436]: *** Including modules done ***
Feb 23 01:36:15 localhost dracut[1436]: *** Installing kernel module dependencies ***
Feb 23 01:36:16 localhost dracut[1436]: *** Installing kernel module dependencies done ***
Feb 23 01:36:16 localhost dracut[1436]: *** Resolving executable dependencies ***
Feb 23 01:36:16 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 01:36:17 localhost dracut[1436]: *** Resolving executable dependencies done ***
Feb 23 01:36:17 localhost dracut[1436]: *** Hardlinking files ***
Feb 23 01:36:17 localhost dracut[1436]: Mode: real
Feb 23 01:36:17 localhost dracut[1436]: Files: 1099
Feb 23 01:36:17 localhost dracut[1436]: Linked: 3 files
Feb 23 01:36:17 localhost dracut[1436]: Compared: 0 xattrs
Feb 23 01:36:17 localhost dracut[1436]: Compared: 373 files
Feb 23 01:36:17 localhost dracut[1436]: Saved: 61.04 KiB
Feb 23 01:36:17 localhost dracut[1436]: Duration: 0.035691 seconds
Feb 23 01:36:17 localhost dracut[1436]: *** Hardlinking files done ***
Feb 23 01:36:17 localhost dracut[1436]: Could not find 'strip'. Not stripping the initramfs.
Feb 23 01:36:17 localhost dracut[1436]: *** Generating early-microcode cpio image ***
Feb 23 01:36:17 localhost dracut[1436]: *** Constructing AuthenticAMD.bin ***
Feb 23 01:36:17 localhost dracut[1436]: *** Store current command line parameters ***
Feb 23 01:36:17 localhost dracut[1436]: Stored kernel commandline:
Feb 23 01:36:17 localhost dracut[1436]: No dracut internal kernel commandline stored in the initramfs
Feb 23 01:36:18 localhost dracut[1436]: *** Install squash loader ***
Feb 23 01:36:18 localhost dracut[1436]: *** Squashing the files inside the initramfs ***
Feb 23 01:36:19 localhost dracut[1436]: *** Squashing the files inside the initramfs done ***
Feb 23 01:36:19 localhost dracut[1436]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Feb 23 01:36:19 localhost dracut[1436]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Feb 23 01:36:20 localhost kdumpctl[1137]: kdump: kexec: loaded kdump kernel
Feb 23 01:36:20 localhost kdumpctl[1137]: kdump: Starting kdump: [OK]
Feb 23 01:36:20 localhost systemd[1]: Finished Crash recovery kernel arming.
Feb 23 01:36:20 localhost systemd[1]: Startup finished in 1.320s (kernel) + 2.026s (initrd) + 17.208s (userspace) = 20.555s.
Feb 23 01:36:32 localhost sshd[4172]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:36:32 localhost systemd[1]: Created slice User Slice of UID 1000.
Feb 23 01:36:32 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 23 01:36:32 localhost systemd-logind[759]: New session 1 of user zuul.
Feb 23 01:36:32 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 23 01:36:32 localhost systemd[1]: Starting User Manager for UID 1000...
Feb 23 01:36:32 localhost systemd[4176]: Queued start job for default target Main User Target.
Feb 23 01:36:32 localhost systemd[4176]: Created slice User Application Slice.
Feb 23 01:36:32 localhost systemd[4176]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 01:36:32 localhost systemd[4176]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 01:36:32 localhost systemd[4176]: Reached target Paths.
Feb 23 01:36:32 localhost systemd[4176]: Reached target Timers.
Feb 23 01:36:32 localhost systemd[4176]: Starting D-Bus User Message Bus Socket...
Feb 23 01:36:32 localhost systemd[4176]: Starting Create User's Volatile Files and Directories...
Feb 23 01:36:32 localhost systemd[4176]: Finished Create User's Volatile Files and Directories.
Feb 23 01:36:32 localhost systemd[4176]: Listening on D-Bus User Message Bus Socket.
Feb 23 01:36:32 localhost systemd[4176]: Reached target Sockets.
Feb 23 01:36:32 localhost systemd[4176]: Reached target Basic System.
Feb 23 01:36:32 localhost systemd[4176]: Reached target Main User Target.
Feb 23 01:36:32 localhost systemd[4176]: Startup finished in 111ms.
Feb 23 01:36:32 localhost systemd[1]: Started User Manager for UID 1000.
Feb 23 01:36:32 localhost systemd[1]: Started Session 1 of User zuul.
Feb 23 01:36:33 localhost python3[4228]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 01:36:36 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 01:36:42 localhost python3[4248]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 01:36:50 localhost python3[4302]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 01:36:51 localhost python3[4332]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 23 01:36:54 localhost python3[4348]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:36:55 localhost python3[4362]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:36:56 localhost python3[4421]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:36:56 localhost python3[4462]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771828616.28678-387-259426370767621/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa follow=False checksum=3856428e4c0cdf708f3b02cf6f4769559d121f25 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:36:58 localhost python3[4535]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:36:58 localhost python3[4576]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771828618.2974393-485-158436035931816/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa.pub follow=False checksum=24c5085c987d798738c880bb8143c9f9cd19ae33 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:17 localhost chronyd[765]: Selected source 167.160.187.179 (2.rhel.pool.ntp.org)
Feb 23 01:37:37 localhost python3[4605]: ansible-ping Invoked with data=pong
Feb 23 01:37:39 localhost python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 01:37:43 localhost python3[4672]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 23 01:37:45 localhost python3[4694]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:45 localhost python3[4708]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:45 localhost python3[4722]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:47 localhost python3[4736]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:47 localhost python3[4750]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:47 localhost python3[4764]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:50 localhost python3[4780]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:51 localhost python3[4828]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:37:51 localhost python3[4871]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771828671.2630305-95-108966421959354/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:37:58 localhost python3[4899]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:37:59 localhost python3[4913]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:37:59 localhost python3[4927]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:37:59 localhost python3[4941]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:00 localhost python3[4955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:00 localhost python3[4969]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:00 localhost python3[4983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:00 localhost python3[4997]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:01 localhost python3[5011]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:01 localhost python3[5025]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:01 localhost python3[5039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:01 localhost python3[5053]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:02 localhost python3[5067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:02 localhost python3[5081]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:02 localhost python3[5095]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:02 localhost python3[5109]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:03 localhost python3[5123]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:03 localhost python3[5137]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:03 localhost python3[5151]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:03 localhost python3[5165]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:04 localhost python3[5179]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:04 localhost python3[5193]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:04 localhost python3[5207]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:38:04 localhost python3[5221]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1
vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 23 01:38:05 localhost python3[5235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 23 01:38:05 localhost python3[5249]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Feb 23 01:38:07 localhost python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Feb 23 01:38:07 localhost systemd[1]: Starting Time & Date Service... Feb 23 01:38:07 localhost systemd[1]: Started Time & Date Service. Feb 23 01:38:07 localhost systemd-timedated[5268]: Changed time zone to 'UTC' (UTC). 
Feb 23 01:38:09 localhost python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:38:10 localhost python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:38:10 localhost python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771828689.974417-491-24273867821658/source _original_basename=tmp27944crw follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:38:11 localhost python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:38:11 localhost python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771828691.467247-584-83037043803092/source _original_basename=tmp_9pjbo_8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:38:13 localhost python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:38:14 localhost python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771828693.6696973-723-110828000767333/source _original_basename=tmpdlis6age follow=False checksum=9313104c4584898a1afe992edc322b557e0f1f28 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:38:15 localhost python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:38:15 localhost python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:38:16 localhost python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:38:17 localhost python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771828696.5313103-852-14575786188836/source _original_basename=tmpi06nmjml follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:38:18 localhost python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-16c2-0802-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:38:29 localhost python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-16c2-0802-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 23 01:38:37 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 01:38:41 localhost python3[5786]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:38:47 localhost systemd[4176]: Starting Mark boot as successful...
Feb 23 01:38:47 localhost systemd[4176]: Finished Mark boot as successful.
Feb 23 01:39:00 localhost python3[5804]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:40:00 localhost systemd-logind[759]: Session 1 logged out. Waiting for processes to exit.
Feb 23 01:40:07 localhost sshd[5807]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:40:46 localhost systemd[1]: Unmounting EFI System Partition Automount...
Feb 23 01:40:46 localhost systemd[1]: efi.mount: Deactivated successfully.
Feb 23 01:40:46 localhost systemd[1]: Unmounted EFI System Partition Automount.
Feb 23 01:41:47 localhost systemd[4176]: Created slice User Background Tasks Slice.
Feb 23 01:41:47 localhost systemd[4176]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 01:41:47 localhost systemd[4176]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Feb 23 01:42:08 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Feb 23 01:42:08 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.6815] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 01:42:08 localhost systemd-udevd[5813]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.6968] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7007] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7014] device (eth1): carrier: link connected
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7017] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7023] policy: auto-activating connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb)
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7030] device (eth1): Activation: starting connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb)
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7031] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7036] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7042] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 23 01:42:08 localhost NetworkManager[788]: [1771828928.7048] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 01:42:09 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Feb 23 01:42:10 localhost sshd[5816]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:42:10 localhost systemd[1]: Started Session 3 of User zuul.
Feb 23 01:42:10 localhost systemd-logind[759]: New session 3 of user zuul.
Feb 23 01:42:10 localhost python3[5833]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-116e-582b-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:42:23 localhost python3[5883]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:42:24 localhost python3[5926]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771828943.449911-435-249364340139678/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=9dbfb9b07f02d8db06baa922059ec27b6663d592 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:42:24 localhost python3[5956]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 01:42:24 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 23 01:42:24 localhost systemd[1]: Stopped Network Manager Wait Online.
Feb 23 01:42:24 localhost systemd[1]: Stopping Network Manager Wait Online...
Feb 23 01:42:24 localhost systemd[1]: Stopping Network Manager...
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7055] caught SIGTERM, shutting down normally.
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7169] dhcp4 (eth0): canceled DHCP transaction
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7170] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7170] dhcp4 (eth0): state changed no lease
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7174] manager: NetworkManager state is now CONNECTING
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7229] dhcp4 (eth1): canceled DHCP transaction
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7230] dhcp4 (eth1): state changed no lease
Feb 23 01:42:24 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 01:42:24 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 01:42:24 localhost NetworkManager[788]: [1771828944.7491] exiting (success)
Feb 23 01:42:24 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 23 01:42:24 localhost systemd[1]: Stopped Network Manager.
Feb 23 01:42:24 localhost systemd[1]: NetworkManager.service: Consumed 1.839s CPU time.
Feb 23 01:42:24 localhost systemd[1]: Starting Network Manager...
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.7922] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:7e1679c6-ea6b-4cb0-813d-ca6f65e53cae)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.7924] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.7942] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 23 01:42:24 localhost systemd[1]: Started Network Manager.
Feb 23 01:42:24 localhost systemd[1]: Starting Network Manager Wait Online...
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8006] manager[0x5640e0c73090]: monitoring kernel firmware directory '/lib/firmware'.
Feb 23 01:42:24 localhost systemd[1]: Starting Hostname Service...
Feb 23 01:42:24 localhost systemd[1]: Started Hostname Service.
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8804] hostname: hostname: using hostnamed
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8804] hostname: static hostname changed from (none) to "np0005626463.novalocal"
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8812] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8819] manager[0x5640e0c73090]: rfkill: Wi-Fi hardware radio set enabled
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8819] manager[0x5640e0c73090]: rfkill: WWAN hardware radio set enabled
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8861] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8862] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8863] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8864] manager: Networking is enabled by state file
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8873] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8873] settings: Loaded settings plugin: keyfile (internal)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8934] dhcp: init: Using DHCP client 'internal'
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8938] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8946] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8954] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8967] device (lo): Activation: starting connection 'lo' (8bdfeccc-b3ac-4c33-8351-8677ac367e4c)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8977] device (eth0): carrier: link connected
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8983] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8991] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.8992] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9001] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9012] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9020] device (eth1): carrier: link connected
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9026] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9034] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb) (indicated)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9034] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9041] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9052] device (eth1): Activation: starting connection 'Wired connection 1' (14875e01-091c-3944-aefd-45256309e1cb)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9080] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9099] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9101] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9104] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9110] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9112] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9115] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9118] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9125] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9128] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9140] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9142] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9179] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9185] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9196] device (lo): Activation: successful, device activated.
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9204] dhcp4 (eth0): state changed new lease, address=38.102.83.164
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9209] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9360] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9412] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9416] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9423] manager: NetworkManager state is now CONNECTED_SITE
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9427] device (eth0): Activation: successful, device activated.
Feb 23 01:42:24 localhost NetworkManager[5974]: [1771828944.9437] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 23 01:42:25 localhost python3[6018]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-116e-582b-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:42:35 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 01:42:54 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 01:43:09 localhost NetworkManager[5974]: [1771828989.8108] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 23 01:43:09 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 01:43:09 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 01:43:09 localhost NetworkManager[5974]: [1771828989.8309] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 23 01:43:09 localhost NetworkManager[5974]: [1771828989.8313] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 23 01:43:09 localhost NetworkManager[5974]: [1771828989.8323] device (eth1): Activation: successful, device activated.
Feb 23 01:43:09 localhost NetworkManager[5974]: [1771828989.8332] manager: startup complete
Feb 23 01:43:09 localhost systemd[1]: Finished Network Manager Wait Online.
Feb 23 01:43:19 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 01:43:25 localhost systemd[1]: session-3.scope: Deactivated successfully.
Feb 23 01:43:25 localhost systemd[1]: session-3.scope: Consumed 1.457s CPU time.
Feb 23 01:43:25 localhost systemd-logind[759]: Session 3 logged out. Waiting for processes to exit.
Feb 23 01:43:25 localhost systemd-logind[759]: Removed session 3.
Feb 23 01:43:44 localhost chronyd[765]: Selected source 23.159.16.194 (2.rhel.pool.ntp.org)
Feb 23 01:44:15 localhost sshd[6057]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:44:32 localhost sshd[6059]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:44:32 localhost systemd-logind[759]: New session 4 of user zuul.
Feb 23 01:44:32 localhost systemd[1]: Started Session 4 of User zuul.
Feb 23 01:44:32 localhost python3[6110]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:44:32 localhost python3[6153]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771829072.3180947-628-278240485659165/source _original_basename=tmp9nrsig8v follow=False checksum=393f60ce964bed22379b4d5935087c828e1455a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:44:37 localhost systemd[1]: session-4.scope: Deactivated successfully.
Feb 23 01:44:37 localhost systemd-logind[759]: Session 4 logged out. Waiting for processes to exit.
Feb 23 01:44:37 localhost systemd-logind[759]: Removed session 4.
Feb 23 01:48:40 localhost sshd[6170]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:49:42 localhost sshd[6172]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:50:43 localhost sshd[6175]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:51:03 localhost sshd[6179]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:51:03 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Feb 23 01:51:03 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 23 01:51:03 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Feb 23 01:51:03 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 23 01:51:03 localhost systemd-logind[759]: New session 5 of user zuul.
Feb 23 01:51:03 localhost systemd[1]: Started Session 5 of User zuul.
Feb 23 01:51:03 localhost python3[6200]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-8ad4-7d7f-00000000219f-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:51:15 localhost python3[6219]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:51:15 localhost python3[6235]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:51:15 localhost python3[6251]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:51:16 localhost python3[6267]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:51:16 localhost python3[6283]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:51:18 localhost python3[6331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:51:18 localhost python3[6374]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771829477.7201293-660-186436323015140/source _original_basename=tmprc1eb3c2 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:51:19 localhost python3[6404]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 01:51:19 localhost systemd[1]: Reloading.
Feb 23 01:51:20 localhost systemd-rc-local-generator[6422]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 01:51:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 01:51:21 localhost python3[6451]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 23 01:51:22 localhost python3[6467]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:51:23 localhost python3[6485]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:51:23 localhost python3[6503]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:51:23 localhost python3[6521]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:51:24 localhost python3[6538]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-8ad4-7d7f-0000000021a6-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:51:35 localhost python3[6558]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 01:51:38 localhost systemd[1]: session-5.scope: Deactivated successfully.
Feb 23 01:51:38 localhost systemd[1]: session-5.scope: Consumed 4.056s CPU time.
Feb 23 01:51:38 localhost systemd-logind[759]: Session 5 logged out. Waiting for processes to exit.
Feb 23 01:51:38 localhost systemd-logind[759]: Removed session 5.
Feb 23 01:51:46 localhost sshd[6563]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:52:30 localhost sshd[6567]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:52:30 localhost systemd-logind[759]: New session 6 of user zuul.
Feb 23 01:52:30 localhost systemd[1]: Started Session 6 of User zuul.
Feb 23 01:52:31 localhost systemd[1]: Starting RHSM dbus service...
Feb 23 01:52:31 localhost systemd[1]: Started RHSM dbus service.
Feb 23 01:52:31 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 01:52:31 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 01:52:31 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 01:52:31 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 01:52:34 localhost rhsm-service[6591]: INFO [subscription_manager.managerlib:90] Consumer created: np0005626463.novalocal (71d8a449-76d3-4525-90bb-1ec088bb454f)
Feb 23 01:52:34 localhost subscription-manager[6591]: Registered system with identity: 71d8a449-76d3-4525-90bb-1ec088bb454f
Feb 23 01:52:35 localhost rhsm-service[6591]: INFO [subscription_manager.entcertlib:131] certs updated:
Feb 23 01:52:35 localhost rhsm-service[6591]: Total updates: 1
Feb 23 01:52:35 localhost rhsm-service[6591]: Found (local) serial# []
Feb 23 01:52:35 localhost rhsm-service[6591]: Expected (UEP) serial# [3819360702608339394]
Feb 23 01:52:35 localhost rhsm-service[6591]: Added (new)
Feb 23 01:52:35 localhost rhsm-service[6591]: [sn:3819360702608339394 ( Content Access,) @ /etc/pki/entitlement/3819360702608339394.pem]
Feb 23 01:52:35 localhost rhsm-service[6591]: Deleted (rogue):
Feb 23 01:52:35 localhost rhsm-service[6591]:
Feb 23 01:52:35 localhost subscription-manager[6591]: Added subscription for 'Content Access' contract 'None'
Feb 23 01:52:35 localhost subscription-manager[6591]: Added subscription for product ' Content Access'
Feb 23 01:52:36 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 01:52:36 localhost rhsm-service[6591]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 23 01:52:36 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 01:52:36 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 01:52:36 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 01:52:37 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 01:52:37 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 01:52:43 localhost python3[6682]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-32b8-b7f7-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:52:45 localhost python3[6701]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 01:52:48 localhost sshd[6708]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:53:14 localhost setsebool[6778]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 23 01:53:14 localhost setsebool[6778]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 23 01:53:22 localhost kernel: SELinux: Converting 406 SID table entries...
Feb 23 01:53:22 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 01:53:22 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 01:53:22 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 01:53:22 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 01:53:22 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 01:53:22 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 01:53:22 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 01:53:35 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=3 res=1
Feb 23 01:53:35 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 01:53:35 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 23 01:53:35 localhost systemd[1]: Reloading.
Feb 23 01:53:35 localhost systemd-rc-local-generator[7639]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 01:53:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 01:53:35 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 01:53:36 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 01:53:43 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 01:53:43 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 01:53:43 localhost systemd[1]: man-db-cache-update.service: Consumed 9.292s CPU time.
Feb 23 01:53:43 localhost systemd[1]: run-r81631f5033c74fd0a27ced99f8b99169.service: Deactivated successfully.
Feb 23 01:53:47 localhost sshd[18356]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:53:52 localhost sshd[18358]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:30 localhost podman[18376]: 2026-02-23 06:54:30.056621751 +0000 UTC m=+0.106063630 system refresh
Feb 23 01:54:30 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 01:54:30 localhost systemd[4176]: Starting D-Bus User Message Bus...
Feb 23 01:54:30 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 23 01:54:30 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 23 01:54:30 localhost systemd[4176]: Started D-Bus User Message Bus.
Feb 23 01:54:30 localhost journal[18433]: Ready
Feb 23 01:54:30 localhost systemd[4176]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Feb 23 01:54:30 localhost systemd[4176]: Created slice Slice /user.
Feb 23 01:54:30 localhost systemd[4176]: podman-18416.scope: unit configures an IP firewall, but not running as root.
Feb 23 01:54:30 localhost systemd[4176]: (This warning is only shown for the first unit using IP firewalling.)
Feb 23 01:54:30 localhost systemd[4176]: Started podman-18416.scope.
Feb 23 01:54:31 localhost systemd[4176]: Started podman-pause-a5e6aa7e.scope.
Feb 23 01:54:33 localhost systemd[1]: session-6.scope: Deactivated successfully.
Feb 23 01:54:33 localhost systemd[1]: session-6.scope: Consumed 49.807s CPU time.
Feb 23 01:54:33 localhost systemd-logind[759]: Session 6 logged out. Waiting for processes to exit.
Feb 23 01:54:33 localhost systemd-logind[759]: Removed session 6.
Feb 23 01:54:43 localhost sshd[18436]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:47 localhost sshd[18438]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:47 localhost sshd[18441]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:47 localhost sshd[18439]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:48 localhost sshd[18442]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:48 localhost sshd[18440]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:52 localhost sshd[18448]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:54:52 localhost systemd-logind[759]: New session 7 of user zuul.
Feb 23 01:54:52 localhost systemd[1]: Started Session 7 of User zuul.
Feb 23 01:54:53 localhost python3[18465]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD0suk+oGhrLCF0TQEPuL+1TMMXZ4ZyjwmaIk09J9Zppa5UYl2p4E22RKwDBWJVKjp5+lVBFxSdpKjyFnuMgKyY= zuul@np0005626456.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:54:53 localhost python3[18481]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBD0suk+oGhrLCF0TQEPuL+1TMMXZ4ZyjwmaIk09J9Zppa5UYl2p4E22RKwDBWJVKjp5+lVBFxSdpKjyFnuMgKyY= zuul@np0005626456.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:54:55 localhost systemd[1]: session-7.scope: Deactivated successfully.
Feb 23 01:54:55 localhost systemd-logind[759]: Session 7 logged out. Waiting for processes to exit.
Feb 23 01:54:55 localhost systemd-logind[759]: Removed session 7.
Feb 23 01:55:38 localhost sshd[18482]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:56:13 localhost sshd[18486]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:56:13 localhost systemd-logind[759]: New session 8 of user zuul.
Feb 23 01:56:13 localhost systemd[1]: Started Session 8 of User zuul.
Feb 23 01:56:13 localhost python3[18505]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 01:56:14 localhost python3[18521]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 23 01:56:16 localhost python3[18571]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:56:16 localhost python3[18614]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771829775.99336-133-41768577803103/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa follow=False checksum=3856428e4c0cdf708f3b02cf6f4769559d121f25 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:56:17 localhost python3[18676]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:56:18 localhost python3[18719]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771829777.5856373-219-264846205402667/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=75c3b09aedfa4a0eb967a11aba86ff70_id_rsa.pub follow=False checksum=24c5085c987d798738c880bb8143c9f9cd19ae33 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:56:20 localhost python3[18749]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:56:21 localhost python3[18795]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:56:21 localhost python3[18811]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpd42ngr18 recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:56:22 localhost python3[18871]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:56:22 localhost python3[18887]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpr86mzc8q recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:56:24 localhost python3[18947]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 01:56:24 localhost python3[18963]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpglgfau3f recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 01:56:25 localhost systemd[1]: session-8.scope: Deactivated successfully.
Feb 23 01:56:25 localhost systemd[1]: session-8.scope: Consumed 3.582s CPU time.
Feb 23 01:56:25 localhost systemd-logind[759]: Session 8 logged out. Waiting for processes to exit.
Feb 23 01:56:25 localhost systemd-logind[759]: Removed session 8.
Feb 23 01:56:34 localhost sshd[18979]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:57:30 localhost sshd[18981]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:58:26 localhost sshd[18984]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:58:26 localhost sshd[18985]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 01:58:26 localhost systemd-logind[759]: New session 9 of user zuul.
Feb 23 01:58:26 localhost systemd[1]: Started Session 9 of User zuul.
Feb 23 01:58:26 localhost python3[19032]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 01:59:23 localhost sshd[19034]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:00:18 localhost sshd[19037]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:01:13 localhost sshd[19054]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:02:07 localhost sshd[19057]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:03:02 localhost sshd[19059]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:03:26 localhost systemd[1]: session-9.scope: Deactivated successfully.
Feb 23 02:03:26 localhost systemd-logind[759]: Session 9 logged out. Waiting for processes to exit.
Feb 23 02:03:26 localhost systemd-logind[759]: Removed session 9.
Feb 23 02:03:41 localhost sshd[19062]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:04:02 localhost sshd[19064]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:05:02 localhost sshd[19066]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:06:01 localhost sshd[19069]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:06:57 localhost sshd[19071]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:07:49 localhost sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:08:42 localhost sshd[19076]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:09:37 localhost sshd[19079]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:09:57 localhost sshd[19082]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:09:57 localhost systemd-logind[759]: New session 10 of user zuul.
Feb 23 02:09:57 localhost systemd[1]: Started Session 10 of User zuul.
Feb 23 02:09:57 localhost python3[19099]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-669c-02d2-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:09:59 localhost python3[19119]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-669c-02d2-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:10:04 localhost python3[19138]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Feb 23 02:10:07 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:10:07 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:10:31 localhost sshd[19280]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:11:04 localhost python3[19299]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Feb 23 02:11:07 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:15 localhost python3[19498]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Feb 23 02:11:17 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:18 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:22 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:22 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:28 localhost sshd[19818]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:11:44 localhost python3[19835]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 23 02:11:46 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:47 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:11:51 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:12:13 localhost python3[20111]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 23 02:12:15 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:12:15 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:12:17 localhost sshd[20238]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:12:20 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:12:20 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:12:23 localhost sshd[20365]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:12:42 localhost python3[20393]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:12:47 localhost python3[20412]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:12:59 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 23 02:13:09 localhost kernel: SELinux: Converting 499 SID table entries...
Feb 23 02:13:09 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 02:13:09 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 02:13:09 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 02:13:09 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 02:13:09 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 02:13:09 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 02:13:09 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 02:13:11 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=4 res=1
Feb 23 02:13:11 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 02:13:11 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 23 02:13:11 localhost systemd[1]: Reloading.
Feb 23 02:13:11 localhost systemd-rc-local-generator[21247]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:13:11 localhost systemd-sysv-generator[21250]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:13:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:13:11 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 02:13:12 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 02:13:12 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 02:13:12 localhost systemd[1]: run-rb609a7387f0c424b8fcd92e8d489019d.service: Deactivated successfully.
Feb 23 02:13:13 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:13:13 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 23 02:13:17 localhost sshd[21783]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:13:48 localhost sshd[21785]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:13:49 localhost python3[21803]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:14:11 localhost sshd[21808]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:14:21 localhost python3[21826]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:14:22 localhost python3[21874]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:14:23 localhost python3[21917]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771830862.3150995-291-83123791134579/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:14:24 localhost python3[21947]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 02:14:24 localhost systemd-journald[618]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Feb 23 02:14:24 localhost systemd-journald[618]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 02:14:24 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 02:14:24 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 02:14:24 localhost python3[21968]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 02:14:25 localhost python3[21988]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 02:14:25 localhost python3[22008]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 02:14:25 localhost python3[22028]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 23 02:14:28 localhost python3[22048]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 02:14:28 localhost systemd[1]: Starting LSB: Bring up/down networking...
Feb 23 02:14:28 localhost network[22051]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:28 localhost network[22062]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:28 localhost network[22051]: WARN : [network] 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:28 localhost network[22063]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:28 localhost network[22051]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 02:14:28 localhost network[22064]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 02:14:28 localhost NetworkManager[5974]: [1771830868.2834] audit: op="connections-reload" pid=22092 uid=0 result="success"
Feb 23 02:14:28 localhost network[22051]: Bringing up loopback interface: [ OK ]
Feb 23 02:14:28 localhost NetworkManager[5974]: [1771830868.4730] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22180 uid=0 result="success"
Feb 23 02:14:28 localhost network[22051]: Bringing up interface eth0: [ OK ]
Feb 23 02:14:28 localhost systemd[1]: Started LSB: Bring up/down networking.
Feb 23 02:14:28 localhost python3[22221]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 02:14:29 localhost systemd[1]: Starting Open vSwitch Database Unit...
Feb 23 02:14:29 localhost chown[22225]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 23 02:14:29 localhost ovs-ctl[22230]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 23 02:14:29 localhost ovs-ctl[22230]: Creating empty database /etc/openvswitch/conf.db [ OK ]
Feb 23 02:14:29 localhost ovs-ctl[22230]: Starting ovsdb-server [ OK ]
Feb 23 02:14:29 localhost ovs-vsctl[22279]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 23 02:14:29 localhost ovs-vsctl[22299]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"96b5bb93-7341-4ce6-9b93-6a5de566c711\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Feb 23 02:14:29 localhost ovs-ctl[22230]: Configuring Open vSwitch system IDs [ OK ]
Feb 23 02:14:29 localhost ovs-ctl[22230]: Enabling remote OVSDB managers [ OK ]
Feb 23 02:14:29 localhost ovs-vsctl[22305]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005626463.novalocal
Feb 23 02:14:29 localhost systemd[1]: Started Open vSwitch Database Unit.
Feb 23 02:14:29 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 23 02:14:29 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 23 02:14:29 localhost systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 23 02:14:29 localhost kernel: openvswitch: Open vSwitch switching datapath
Feb 23 02:14:29 localhost ovs-ctl[22349]: Inserting openvswitch module [ OK ]
Feb 23 02:14:29 localhost ovs-ctl[22318]: Starting ovs-vswitchd [ OK ]
Feb 23 02:14:29 localhost ovs-ctl[22318]: Enabling remote OVSDB managers [ OK ]
Feb 23 02:14:29 localhost ovs-vsctl[22367]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005626463.novalocal
Feb 23 02:14:29 localhost systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 23 02:14:29 localhost systemd[1]: Starting Open vSwitch...
Feb 23 02:14:29 localhost systemd[1]: Finished Open vSwitch.
Feb 23 02:14:33 localhost python3[22385]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:14:34 localhost NetworkManager[5974]: [1771830874.5153] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22581 uid=0 result="success"
Feb 23 02:14:34 localhost ifup[22582]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:34 localhost ifup[22583]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:34 localhost ifup[22584]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:34 localhost NetworkManager[5974]: [1771830874.5498] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22590 uid=0 result="success"
Feb 23 02:14:34 localhost ovs-vsctl[22592]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:f0:80:57 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Feb 23 02:14:34 localhost kernel: device ovs-system entered promiscuous mode
Feb 23 02:14:34 localhost NetworkManager[5974]: [1771830874.5790] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Feb 23 02:14:34 localhost kernel: Timeout policy base is empty
Feb 23 02:14:34 localhost kernel: Failed to associated timeout policy `ovs_test_tp'
Feb 23 02:14:34 localhost systemd-udevd[22594]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 02:14:34 localhost kernel: device br-ex entered promiscuous mode
Feb 23 02:14:34 localhost NetworkManager[5974]: [1771830874.6271] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Feb 23 02:14:34 localhost NetworkManager[5974]: [1771830874.6550] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22619 uid=0 result="success"
Feb 23 02:14:34 localhost NetworkManager[5974]: [1771830874.6765] device (br-ex): carrier: link connected
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.7325] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22648 uid=0 result="success"
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.7779] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22663 uid=0 result="success"
Feb 23 02:14:37 localhost NET[22688]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.8657] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.8739] dhcp4 (eth1): canceled DHCP transaction
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.8739] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.8739] dhcp4 (eth1): state changed no lease
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.8779] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22697 uid=0 result="success"
Feb 23 02:14:37 localhost ifup[22698]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:37 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 02:14:37 localhost ifup[22699]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:37 localhost ifup[22701]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:37 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.9113] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22715 uid=0 result="success"
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.9558] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22725 uid=0 result="success"
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.9626] device (eth1): carrier: link connected
Feb 23 02:14:37 localhost NetworkManager[5974]: [1771830877.9842] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22734 uid=0 result="success"
Feb 23 02:14:38 localhost ipv6_wait_tentative[22746]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 23 02:14:39 localhost ipv6_wait_tentative[22751]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.0566] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22761 uid=0 result="success"
Feb 23 02:14:40 localhost ovs-vsctl[22776]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Feb 23 02:14:40 localhost kernel: device eth1 entered promiscuous mode
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.1325] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22783 uid=0 result="success"
Feb 23 02:14:40 localhost ifup[22784]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:40 localhost ifup[22785]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:40 localhost ifup[22786]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.1641] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22792 uid=0 result="success"
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.2090] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22802 uid=0 result="success"
Feb 23 02:14:40 localhost ifup[22803]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:40 localhost ifup[22804]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:40 localhost ifup[22805]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.2426] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22811 uid=0 result="success"
Feb 23 02:14:40 localhost ovs-vsctl[22814]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 23 02:14:40 localhost kernel: device vlan20 entered promiscuous mode
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.2851] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Feb 23 02:14:40 localhost systemd-udevd[22816]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.3118] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22825 uid=0 result="success"
Feb 23 02:14:40 localhost NetworkManager[5974]: [1771830880.3338] device (vlan20): carrier: link connected
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.3950] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22854 uid=0 result="success"
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.4410] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22869 uid=0 result="success"
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.4999] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22890 uid=0 result="success"
Feb 23 02:14:43 localhost ifup[22891]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:43 localhost ifup[22892]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:43 localhost ifup[22893]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.5309] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22899 uid=0 result="success"
Feb 23 02:14:43 localhost ovs-vsctl[22902]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 23 02:14:43 localhost kernel: device vlan21 entered promiscuous mode
Feb 23 02:14:43 localhost systemd-udevd[22904]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.5725] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.5953] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22914 uid=0 result="success"
Feb 23 02:14:43 localhost NetworkManager[5974]: [1771830883.6160] device (vlan21): carrier: link connected
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.6694] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22944 uid=0 result="success"
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.7159] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22959 uid=0 result="success"
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.7731] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22980 uid=0 result="success"
Feb 23 02:14:46 localhost ifup[22981]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:46 localhost ifup[22982]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:46 localhost ifup[22983]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.8040] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22989 uid=0 result="success"
Feb 23 02:14:46 localhost ovs-vsctl[22992]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 23 02:14:46 localhost systemd-udevd[22994]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 02:14:46 localhost kernel: device vlan23 entered promiscuous mode
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.8437] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.8678] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23004 uid=0 result="success"
Feb 23 02:14:46 localhost NetworkManager[5974]: [1771830886.8878] device (vlan23): carrier: link connected
Feb 23 02:14:47 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 02:14:49 localhost NetworkManager[5974]: [1771830889.9385] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23035 uid=0 result="success"
Feb 23 02:14:49 localhost NetworkManager[5974]: [1771830889.9847] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23050 uid=0 result="success"
Feb 23 02:14:50 localhost NetworkManager[5974]: [1771830890.0281] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23071 uid=0 result="success"
Feb 23 02:14:50 localhost ifup[23072]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:50 localhost ifup[23073]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:50 localhost ifup[23074]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:50 localhost NetworkManager[5974]: [1771830890.0561] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23080 uid=0 result="success"
Feb 23 02:14:50 localhost ovs-vsctl[23083]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 23 02:14:50 localhost systemd-udevd[23085]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 02:14:50 localhost kernel: device vlan22 entered promiscuous mode
Feb 23 02:14:50 localhost NetworkManager[5974]: [1771830890.0900] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Feb 23 02:14:50 localhost NetworkManager[5974]: [1771830890.1106] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23095 uid=0 result="success"
Feb 23 02:14:50 localhost NetworkManager[5974]: [1771830890.1300] device (vlan22): carrier: link connected
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.1942] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23125 uid=0 result="success"
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.2423] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23140 uid=0 result="success"
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.3036] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23161 uid=0 result="success"
Feb 23 02:14:53 localhost ifup[23162]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:53 localhost ifup[23163]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:53 localhost ifup[23164]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.3368] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23170 uid=0 result="success"
Feb 23 02:14:53 localhost ovs-vsctl[23173]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 23 02:14:53 localhost systemd-udevd[23175]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 02:14:53 localhost kernel: device vlan44 entered promiscuous mode
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.3842] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.4125] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23185 uid=0 result="success"
Feb 23 02:14:53 localhost NetworkManager[5974]: [1771830893.4369] device (vlan44): carrier: link connected
Feb 23 02:14:56 localhost NetworkManager[5974]: [1771830896.4946] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23215 uid=0 result="success"
Feb 23 02:14:56 localhost NetworkManager[5974]: [1771830896.5474] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23230 uid=0 result="success"
Feb 23 02:14:56 localhost NetworkManager[5974]: [1771830896.6128] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23251 uid=0 result="success"
Feb 23 02:14:56 localhost ifup[23252]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:56 localhost ifup[23253]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:56 localhost ifup[23254]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:56 localhost NetworkManager[5974]: [1771830896.6469] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23260 uid=0 result="success"
Feb 23 02:14:56 localhost ovs-vsctl[23263]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 23 02:14:56 localhost NetworkManager[5974]: [1771830896.7126] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23270 uid=0 result="success"
Feb 23 02:14:57 localhost NetworkManager[5974]: [1771830897.7811] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23297 uid=0 result="success"
Feb 23 02:14:57 localhost NetworkManager[5974]: [1771830897.8281] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23312 uid=0 result="success"
Feb 23 02:14:57 localhost NetworkManager[5974]: [1771830897.8897] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23333 uid=0 result="success"
Feb 23 02:14:57 localhost ifup[23334]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:57 localhost ifup[23335]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:57 localhost ifup[23336]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:57 localhost NetworkManager[5974]: [1771830897.9231] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23342 uid=0 result="success"
Feb 23 02:14:57 localhost ovs-vsctl[23345]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 23 02:14:57 localhost NetworkManager[5974]: [1771830897.9825] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23352 uid=0 result="success"
Feb 23 02:14:59 localhost NetworkManager[5974]: [1771830899.0422] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23380 uid=0 result="success"
Feb 23 02:14:59 localhost NetworkManager[5974]: [1771830899.0911] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23395 uid=0 result="success"
Feb 23 02:14:59 localhost NetworkManager[5974]: [1771830899.1485] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23416 uid=0 result="success"
Feb 23 02:14:59 localhost ifup[23417]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:14:59 localhost ifup[23418]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:14:59 localhost ifup[23419]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:14:59 localhost NetworkManager[5974]: [1771830899.1690] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23425 uid=0 result="success"
Feb 23 02:14:59 localhost ovs-vsctl[23428]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 23 02:14:59 localhost NetworkManager[5974]: [1771830899.2119] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23435 uid=0 result="success"
Feb 23 02:15:00 localhost NetworkManager[5974]: [1771830900.2674] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23463 uid=0 result="success"
Feb 23 02:15:00 localhost NetworkManager[5974]: [1771830900.3170] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23478 uid=0 result="success"
Feb 23 02:15:00 localhost NetworkManager[5974]: [1771830900.3767] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23499 uid=0 result="success"
Feb 23 02:15:00 localhost ifup[23500]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:15:00 localhost ifup[23501]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:15:00 localhost ifup[23502]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:15:00 localhost NetworkManager[5974]: [1771830900.4093] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23508 uid=0 result="success"
Feb 23 02:15:00 localhost ovs-vsctl[23511]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 23 02:15:00 localhost NetworkManager[5974]: [1771830900.4687] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23518 uid=0 result="success"
Feb 23 02:15:01 localhost NetworkManager[5974]: [1771830901.5263] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23546 uid=0 result="success"
Feb 23 02:15:01 localhost NetworkManager[5974]: [1771830901.5717] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23561 uid=0 result="success"
Feb 23 02:15:01 localhost NetworkManager[5974]: [1771830901.6304] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23582 uid=0 result="success"
Feb 23 02:15:01 localhost ifup[23583]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 23 02:15:01 localhost ifup[23584]: 'network-scripts' will be removed from distribution in near future.
Feb 23 02:15:01 localhost ifup[23585]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 23 02:15:01 localhost NetworkManager[5974]: [1771830901.6625] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23591 uid=0 result="success"
Feb 23 02:15:01 localhost ovs-vsctl[23594]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 23 02:15:01 localhost NetworkManager[5974]: [1771830901.7221] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23601 uid=0 result="success"
Feb 23 02:15:02 localhost NetworkManager[5974]: [1771830902.7855] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23629 uid=0 result="success"
Feb 23 02:15:02 localhost NetworkManager[5974]: [1771830902.8342] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23644 uid=0 result="success"
Feb 23 02:15:06 localhost sshd[23662]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:15:47 localhost sshd[23665]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:15:55 localhost python3[23681]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:16:01 localhost python3[23700]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 02:16:01 localhost python3[23716]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 02:16:02 localhost sshd[23717]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:16:03 localhost python3[23732]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 02:16:03 localhost python3[23748]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 23 02:16:04 localhost python3[23762]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Feb 23 02:16:05 localhost python3[23777]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005626463.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:16:06 localhost python3[23797]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-669c-02d2-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:16:06 localhost systemd[1]: Starting Hostname Service...
Feb 23 02:16:06 localhost systemd[1]: Started Hostname Service.
Feb 23 02:16:06 localhost systemd-hostnamed[23801]: Hostname set to (static)
Feb 23 02:16:06 localhost NetworkManager[5974]: [1771830966.8026] hostname: static hostname changed from "np0005626463.novalocal" to "np0005626463.localdomain"
Feb 23 02:16:06 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 23 02:16:06 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 23 02:16:08 localhost systemd[1]: session-10.scope: Deactivated successfully.
Feb 23 02:16:08 localhost systemd[1]: session-10.scope: Consumed 1min 42.279s CPU time.
Feb 23 02:16:08 localhost systemd-logind[759]: Session 10 logged out. Waiting for processes to exit.
Feb 23 02:16:08 localhost systemd-logind[759]: Removed session 10.
Feb 23 02:16:10 localhost sshd[23812]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:16:10 localhost systemd[1]: Started Session 11 of User zuul.
Feb 23 02:16:10 localhost systemd-logind[759]: New session 11 of user zuul.
Feb 23 02:16:11 localhost python3[23829]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 23 02:16:13 localhost systemd[1]: session-11.scope: Deactivated successfully.
Feb 23 02:16:13 localhost systemd-logind[759]: Session 11 logged out. Waiting for processes to exit.
Feb 23 02:16:13 localhost systemd-logind[759]: Removed session 11.
Feb 23 02:16:16 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 23 02:16:36 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 02:16:58 localhost sshd[23834]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:16:58 localhost systemd-logind[759]: New session 12 of user zuul.
Feb 23 02:16:58 localhost systemd[1]: Started Session 12 of User zuul.
Feb 23 02:16:59 localhost python3[23853]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:16:59 localhost sshd[23855]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:17:03 localhost systemd[1]: Reloading.
Feb 23 02:17:03 localhost systemd-rc-local-generator[23892]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:17:03 localhost systemd-sysv-generator[23897]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:17:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:17:03 localhost systemd[1]: Starting dnf makecache...
Feb 23 02:17:03 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 23 02:17:03 localhost systemd[1]: Reloading.
Feb 23 02:17:03 localhost dnf[23910]: Updating Subscription Management repositories.
Feb 23 02:17:03 localhost systemd-rc-local-generator[23937]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:17:03 localhost systemd-sysv-generator[23941]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:17:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:17:03 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 23 02:17:03 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 23 02:17:03 localhost systemd[1]: Reloading.
Feb 23 02:17:03 localhost systemd-rc-local-generator[23975]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:17:03 localhost systemd-sysv-generator[23978]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:17:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:17:04 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Feb 23 02:17:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 02:17:04 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 23 02:17:04 localhost systemd[1]: Reloading.
Feb 23 02:17:04 localhost systemd-rc-local-generator[24022]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:17:04 localhost systemd-sysv-generator[24027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:17:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:17:04 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 02:17:04 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 02:17:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 02:17:04 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 02:17:04 localhost systemd[1]: run-rb78154d55188478394a5af81ba94bc52.service: Deactivated successfully.
Feb 23 02:17:04 localhost systemd[1]: run-r32bec21cddeb4fa780f7571856fe16fe.service: Deactivated successfully.
Feb 23 02:17:05 localhost dnf[23910]: Failed determining last makecache time.
Feb 23 02:17:05 localhost dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - High Av 29 kB/s | 4.0 kB 00:00
Feb 23 02:17:05 localhost dnf[23910]: Fast Datapath for RHEL 9 x86_64 (RPMs) 26 kB/s | 4.0 kB 00:00
Feb 23 02:17:05 localhost dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 28 kB/s | 4.1 kB 00:00
Feb 23 02:17:05 localhost dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 30 kB/s | 4.1 kB 00:00
Feb 23 02:17:06 localhost dnf[23910]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_ 30 kB/s | 4.0 kB 00:00
Feb 23 02:17:06 localhost dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 31 kB/s | 4.5 kB 00:00
Feb 23 02:17:06 localhost dnf[23910]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 33 kB/s | 4.5 kB 00:00
Feb 23 02:17:06 localhost dnf[23910]: Metadata cache created.
Feb 23 02:17:06 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 23 02:17:06 localhost systemd[1]: Finished dnf makecache.
Feb 23 02:17:06 localhost systemd[1]: dnf-makecache.service: Consumed 2.833s CPU time.
Feb 23 02:17:40 localhost sshd[24635]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:17:56 localhost sshd[24636]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:18:05 localhost systemd-logind[759]: Session 12 logged out. Waiting for processes to exit.
Feb 23 02:18:05 localhost systemd[1]: session-12.scope: Deactivated successfully.
Feb 23 02:18:05 localhost systemd[1]: session-12.scope: Consumed 4.768s CPU time.
Feb 23 02:18:05 localhost systemd-logind[759]: Removed session 12.
Feb 23 02:18:51 localhost sshd[24638]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:19:05 localhost sshd[24640]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:19:42 localhost sshd[24642]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:20:36 localhost sshd[24644]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:21:29 localhost sshd[24646]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:22:13 localhost sshd[24649]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:22:23 localhost sshd[24651]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:24:12 localhost sshd[24653]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:25:17 localhost sshd[24655]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:28:14 localhost sshd[24659]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:31:23 localhost sshd[24663]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:34:09 localhost sshd[24666]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:34:09 localhost systemd-logind[759]: New session 13 of user zuul.
Feb 23 02:34:09 localhost systemd[1]: Started Session 13 of User zuul.
Feb 23 02:34:09 localhost python3[24714]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 02:34:11 localhost python3[24801]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:34:14 localhost python3[24818]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:34:15 localhost python3[24834]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:15 localhost kernel: loop: module loaded
Feb 23 02:34:15 localhost kernel: loop3: detected capacity change from 0 to 14680064
Feb 23 02:34:15 localhost python3[24860]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:16 localhost lvm[24863]: PV /dev/loop3 not used.
Feb 23 02:34:16 localhost lvm[24865]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 02:34:16 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb 23 02:34:16 localhost lvm[24870]: 1 logical volume(s) in volume group "ceph_vg0" now active
Feb 23 02:34:16 localhost lvm[24875]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 02:34:16 localhost lvm[24875]: VG ceph_vg0 finished
Feb 23 02:34:16 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb 23 02:34:16 localhost python3[24923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:34:17 localhost python3[24966]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832056.4736516-55332-185809912784296/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:18 localhost python3[24996]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:34:19 localhost systemd[1]: Reloading.
Feb 23 02:34:19 localhost systemd-sysv-generator[25029]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:34:19 localhost systemd-rc-local-generator[25022]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:34:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:34:19 localhost systemd[1]: Starting Ceph OSD losetup...
Feb 23 02:34:19 localhost bash[25038]: /dev/loop3: [64516]:8399529 (/var/lib/ceph-osd-0.img)
Feb 23 02:34:19 localhost systemd[1]: Finished Ceph OSD losetup.
Feb 23 02:34:19 localhost lvm[25040]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 23 02:34:19 localhost lvm[25040]: VG ceph_vg0 finished
Feb 23 02:34:20 localhost python3[25056]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:34:22 localhost python3[25073]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:34:23 localhost python3[25089]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:23 localhost kernel: loop4: detected capacity change from 0 to 14680064
Feb 23 02:34:23 localhost python3[25111]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:24 localhost lvm[25114]: PV /dev/loop4 not used.
Feb 23 02:34:24 localhost lvm[25116]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 02:34:24 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Feb 23 02:34:24 localhost lvm[25126]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 02:34:24 localhost lvm[25126]: VG ceph_vg1 finished
Feb 23 02:34:24 localhost lvm[25127]: 1 logical volume(s) in volume group "ceph_vg1" now active
Feb 23 02:34:24 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Feb 23 02:34:24 localhost python3[25175]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:34:25 localhost python3[25218]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832064.5415137-55436-245063323876259/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:25 localhost python3[25248]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:34:26 localhost systemd[1]: Reloading.
Feb 23 02:34:26 localhost systemd-sysv-generator[25281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:34:26 localhost systemd-rc-local-generator[25276]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:34:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:34:27 localhost systemd[1]: Starting Ceph OSD losetup...
Feb 23 02:34:27 localhost bash[25289]: /dev/loop4: [64516]:9169183 (/var/lib/ceph-osd-1.img)
Feb 23 02:34:27 localhost systemd[1]: Finished Ceph OSD losetup.
Feb 23 02:34:27 localhost lvm[25290]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 23 02:34:27 localhost lvm[25290]: VG ceph_vg1 finished
Feb 23 02:34:30 localhost sshd[25293]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:34:35 localhost sshd[25338]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:34:35 localhost python3[25340]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 02:34:37 localhost python3[25361]: ansible-hostname Invoked with name=np0005626463.localdomain use=None
Feb 23 02:34:37 localhost systemd[1]: Starting Hostname Service...
Feb 23 02:34:37 localhost systemd[1]: Started Hostname Service.
Feb 23 02:34:39 localhost python3[25384]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 23 02:34:40 localhost python3[25432]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.eefre24ztmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:40 localhost python3[25462]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.eefre24ztmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:41 localhost python3[25478]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.eefre24ztmphosts insertbefore=BOF block=192.168.122.106 np0005626463.localdomain np0005626463#012192.168.122.106 np0005626463.ctlplane.localdomain np0005626463.ctlplane#012192.168.122.107 np0005626465.localdomain np0005626465#012192.168.122.107 np0005626465.ctlplane.localdomain np0005626465.ctlplane#012192.168.122.108 np0005626466.localdomain np0005626466#012192.168.122.108 np0005626466.ctlplane.localdomain np0005626466.ctlplane#012192.168.122.103 np0005626459.localdomain np0005626459#012192.168.122.103 np0005626459.ctlplane.localdomain np0005626459.ctlplane#012192.168.122.104 np0005626460.localdomain np0005626460#012192.168.122.104 np0005626460.ctlplane.localdomain np0005626460.ctlplane#012192.168.122.105 np0005626461.localdomain np0005626461#012192.168.122.105 np0005626461.ctlplane.localdomain np0005626461.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:41 localhost python3[25494]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.eefre24ztmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:42 localhost python3[25511]: ansible-file Invoked with path=/tmp/ansible.eefre24ztmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:44 localhost python3[25527]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:45 localhost python3[25545]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:34:49 localhost python3[25594]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:34:49 localhost python3[25639]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832088.7710097-56351-32313271728406/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:51 localhost python3[25669]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:34:52 localhost python3[25687]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 02:34:52 localhost chronyd[765]: chronyd exiting
Feb 23 02:34:52 localhost systemd[1]: Stopping NTP client/server...
Feb 23 02:34:52 localhost systemd[1]: chronyd.service: Deactivated successfully.
Feb 23 02:34:52 localhost systemd[1]: Stopped NTP client/server.
Feb 23 02:34:52 localhost systemd[1]: chronyd.service: Consumed 110ms CPU time, read 1.9M from disk, written 0B to disk.
Feb 23 02:34:52 localhost systemd[1]: Starting NTP client/server...
Feb 23 02:34:52 localhost chronyd[25695]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 02:34:52 localhost chronyd[25695]: Frequency -30.767 +/- 0.158 ppm read from /var/lib/chrony/drift
Feb 23 02:34:52 localhost chronyd[25695]: Loaded seccomp filter (level 2)
Feb 23 02:34:52 localhost systemd[1]: Started NTP client/server.
Feb 23 02:34:54 localhost python3[25744]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:34:55 localhost python3[25787]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771832094.2500186-56514-214745890760840/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:34:55 localhost python3[25817]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:34:55 localhost systemd[1]: Reloading.
Feb 23 02:34:55 localhost systemd-rc-local-generator[25840]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:34:55 localhost systemd-sysv-generator[25845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:34:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:34:55 localhost systemd[1]: Reloading.
Feb 23 02:34:56 localhost systemd-rc-local-generator[25885]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:34:56 localhost systemd-sysv-generator[25888]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:34:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:34:56 localhost systemd[1]: Starting chronyd online sources service...
Feb 23 02:34:56 localhost chronyc[25894]: 200 OK
Feb 23 02:34:56 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Feb 23 02:34:56 localhost systemd[1]: Finished chronyd online sources service.
Feb 23 02:34:57 localhost python3[25910]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:57 localhost chronyd[25695]: System clock was stepped by 0.000000 seconds
Feb 23 02:34:57 localhost python3[25927]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:34:57 localhost chronyd[25695]: Selected source 167.160.187.179 (pool.ntp.org)
Feb 23 02:35:07 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 23 02:35:08 localhost python3[25947]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 23 02:35:08 localhost systemd[1]: Starting Time & Date Service...
Feb 23 02:35:08 localhost systemd[1]: Started Time & Date Service.
Feb 23 02:35:09 localhost python3[25967]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 02:35:09 localhost chronyd[25695]: chronyd exiting
Feb 23 02:35:09 localhost systemd[1]: Stopping NTP client/server...
Feb 23 02:35:09 localhost systemd[1]: chronyd.service: Deactivated successfully.
Feb 23 02:35:09 localhost systemd[1]: Stopped NTP client/server.
Feb 23 02:35:09 localhost systemd[1]: Starting NTP client/server...
Feb 23 02:35:09 localhost chronyd[25974]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 23 02:35:09 localhost chronyd[25974]: Frequency -30.767 +/- 0.170 ppm read from /var/lib/chrony/drift
Feb 23 02:35:09 localhost chronyd[25974]: Loaded seccomp filter (level 2)
Feb 23 02:35:09 localhost systemd[1]: Started NTP client/server.
Feb 23 02:35:13 localhost chronyd[25974]: Selected source 167.160.187.179 (pool.ntp.org)
Feb 23 02:35:28 localhost sshd[26169]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:35:38 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 23 02:36:51 localhost sshd[26173]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:03 localhost sshd[26175]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:03 localhost systemd-logind[759]: New session 14 of user ceph-admin.
Feb 23 02:37:03 localhost systemd[1]: Created slice User Slice of UID 1002.
Feb 23 02:37:03 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Feb 23 02:37:03 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Feb 23 02:37:03 localhost systemd[1]: Starting User Manager for UID 1002...
Feb 23 02:37:03 localhost sshd[26192]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:03 localhost systemd[26179]: Queued start job for default target Main User Target.
Feb 23 02:37:03 localhost systemd[26179]: Created slice User Application Slice.
Feb 23 02:37:03 localhost systemd[26179]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 23 02:37:03 localhost systemd[26179]: Started Daily Cleanup of User's Temporary Directories.
Feb 23 02:37:03 localhost systemd[26179]: Reached target Paths.
Feb 23 02:37:03 localhost systemd[26179]: Reached target Timers.
Feb 23 02:37:03 localhost systemd[26179]: Starting D-Bus User Message Bus Socket...
Feb 23 02:37:03 localhost systemd[26179]: Starting Create User's Volatile Files and Directories...
Feb 23 02:37:03 localhost systemd[26179]: Listening on D-Bus User Message Bus Socket.
Feb 23 02:37:03 localhost systemd[26179]: Reached target Sockets.
Feb 23 02:37:03 localhost systemd[26179]: Finished Create User's Volatile Files and Directories.
Feb 23 02:37:03 localhost systemd[26179]: Reached target Basic System.
Feb 23 02:37:03 localhost systemd[26179]: Reached target Main User Target.
Feb 23 02:37:03 localhost systemd[26179]: Startup finished in 117ms.
Feb 23 02:37:03 localhost systemd[1]: Started User Manager for UID 1002.
Feb 23 02:37:03 localhost systemd[1]: Started Session 14 of User ceph-admin.
Feb 23 02:37:03 localhost systemd-logind[759]: New session 16 of user ceph-admin.
Feb 23 02:37:03 localhost systemd[1]: Started Session 16 of User ceph-admin.
Feb 23 02:37:04 localhost sshd[26214]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:04 localhost systemd-logind[759]: New session 17 of user ceph-admin.
Feb 23 02:37:04 localhost systemd[1]: Started Session 17 of User ceph-admin.
Feb 23 02:37:04 localhost sshd[26233]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:04 localhost systemd-logind[759]: New session 18 of user ceph-admin.
Feb 23 02:37:04 localhost systemd[1]: Started Session 18 of User ceph-admin.
Feb 23 02:37:05 localhost sshd[26252]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:05 localhost systemd-logind[759]: New session 19 of user ceph-admin.
Feb 23 02:37:05 localhost systemd[1]: Started Session 19 of User ceph-admin.
Feb 23 02:37:05 localhost sshd[26271]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:05 localhost systemd-logind[759]: New session 20 of user ceph-admin.
Feb 23 02:37:05 localhost systemd[1]: Started Session 20 of User ceph-admin.
Feb 23 02:37:05 localhost sshd[26290]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:05 localhost systemd-logind[759]: New session 21 of user ceph-admin.
Feb 23 02:37:05 localhost systemd[1]: Started Session 21 of User ceph-admin.
Feb 23 02:37:06 localhost sshd[26309]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:06 localhost systemd-logind[759]: New session 22 of user ceph-admin.
Feb 23 02:37:06 localhost systemd[1]: Started Session 22 of User ceph-admin.
Feb 23 02:37:06 localhost sshd[26328]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:06 localhost systemd-logind[759]: New session 23 of user ceph-admin.
Feb 23 02:37:06 localhost systemd[1]: Started Session 23 of User ceph-admin.
Feb 23 02:37:06 localhost sshd[26347]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:07 localhost systemd-logind[759]: New session 24 of user ceph-admin.
Feb 23 02:37:07 localhost systemd[1]: Started Session 24 of User ceph-admin.
Feb 23 02:37:07 localhost sshd[26364]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:07 localhost systemd-logind[759]: New session 25 of user ceph-admin.
Feb 23 02:37:07 localhost systemd[1]: Started Session 25 of User ceph-admin.
Feb 23 02:37:07 localhost sshd[26383]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:08 localhost systemd-logind[759]: New session 26 of user ceph-admin.
Feb 23 02:37:08 localhost systemd[1]: Started Session 26 of User ceph-admin.
Feb 23 02:37:08 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:39 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:40 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:40 localhost sshd[26587]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:40 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:40 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:40 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26601 (sysctl)
Feb 23 02:37:40 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 23 02:37:40 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 23 02:37:41 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:42 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:37:44 localhost sshd[26754]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:37:47 localhost kernel: VFS: idmapped mount is not enabled.
Feb 23 02:38:11 localhost podman[26739]:
Feb 23 02:38:11 localhost podman[26739]: 2026-02-23 07:38:11.901544641 +0000 UTC m=+29.685957491 container create 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, GIT_BRANCH=main, name=rhceph, release=1770267347, build-date=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Feb 23 02:38:11 localhost podman[26739]: 2026-02-23 07:37:42.259044241 +0000 UTC m=+0.043457091 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:38:11 localhost systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2997277124-merged.mount: Deactivated successfully.
Feb 23 02:38:11 localhost systemd[1]: Created slice Slice /machine.
Feb 23 02:38:11 localhost systemd[1]: Started libpod-conmon-8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388.scope.
Feb 23 02:38:11 localhost systemd[1]: Started libcrun container.
Feb 23 02:38:12 localhost podman[26739]: 2026-02-23 07:38:12.015487488 +0000 UTC m=+29.799900328 container init 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux , version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 02:38:12 localhost podman[26739]: 2026-02-23 07:38:12.025035648 +0000 UTC m=+29.809448498 container start 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux , vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, ceph=True, release=1770267347, RELEASE=main)
Feb 23 02:38:12 localhost podman[26739]: 2026-02-23 07:38:12.025303777 +0000 UTC m=+29.809716617 container attach 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 02:38:12 localhost nifty_grothendieck[27003]: 167 167
Feb 23 02:38:12 localhost systemd[1]: libpod-8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388.scope: Deactivated successfully.
Feb 23 02:38:12 localhost podman[26739]: 2026-02-23 07:38:12.029406275 +0000 UTC m=+29.813819145 container died 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, version=7)
Feb 23 02:38:12 localhost podman[27008]: 2026-02-23 07:38:12.105587624 +0000 UTC m=+0.067433226 container remove 8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_grothendieck, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux )
Feb 23 02:38:12 localhost systemd[1]: libpod-conmon-8f998872d27d03833ca99a5745743a8bc54420c1993e9de2a808878626c4e388.scope: Deactivated successfully.
Feb 23 02:38:12 localhost podman[27028]:
Feb 23 02:38:12 localhost podman[27028]: 2026-02-23 07:38:12.339531111 +0000 UTC m=+0.080969601 container create 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, build-date=2026-02-09T10:25:24Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, GIT_BRANCH=main)
Feb 23 02:38:12 localhost systemd[1]: Started libpod-conmon-1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde.scope.
Feb 23 02:38:12 localhost systemd[1]: Started libcrun container.
Feb 23 02:38:12 localhost podman[27028]: 2026-02-23 07:38:12.300024774 +0000 UTC m=+0.041463274 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:38:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a33ff749735729ecbdef763b7fb50eadabab15c6924fd681e80a2b1c973bd42/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a33ff749735729ecbdef763b7fb50eadabab15c6924fd681e80a2b1c973bd42/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:12 localhost podman[27028]: 2026-02-23 07:38:12.423692597 +0000 UTC m=+0.165131097 container init 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.42.2, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True)
Feb 23 02:38:12 localhost podman[27028]: 2026-02-23 07:38:12.432387919 +0000 UTC m=+0.173826419 container start 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 02:38:12 localhost podman[27028]: 2026-02-23 07:38:12.432649008 +0000 UTC m=+0.174087508 container attach 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc.)
Feb 23 02:38:12 localhost systemd[1]: var-lib-containers-storage-overlay-fb7a309750f49f2c9aff72314ba3c61d17c899c3c3b2d9a9fecc7124a146141a-merged.mount: Deactivated successfully.
Feb 23 02:38:13 localhost angry_dirac[27043]: [
Feb 23 02:38:13 localhost angry_dirac[27043]: {
Feb 23 02:38:13 localhost angry_dirac[27043]: "available": false,
Feb 23 02:38:13 localhost angry_dirac[27043]: "ceph_device": false,
Feb 23 02:38:13 localhost angry_dirac[27043]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 02:38:13 localhost angry_dirac[27043]: "lsm_data": {},
Feb 23 02:38:13 localhost angry_dirac[27043]: "lvs": [],
Feb 23 02:38:13 localhost angry_dirac[27043]: "path": "/dev/sr0",
Feb 23 02:38:13 localhost angry_dirac[27043]: "rejected_reasons": [
Feb 23 02:38:13 localhost angry_dirac[27043]: "Insufficient space (<5GB)",
Feb 23 02:38:13 localhost angry_dirac[27043]: "Has a FileSystem"
Feb 23 02:38:13 localhost angry_dirac[27043]: ],
Feb 23 02:38:13 localhost angry_dirac[27043]: "sys_api": {
Feb 23 02:38:13 localhost angry_dirac[27043]: "actuators": null,
Feb 23 02:38:13 localhost angry_dirac[27043]: "device_nodes": "sr0",
Feb 23 02:38:13 localhost angry_dirac[27043]: "human_readable_size": "482.00 KB",
Feb 23 02:38:13 localhost angry_dirac[27043]: "id_bus": "ata",
Feb 23 02:38:13 localhost angry_dirac[27043]: "model": "QEMU DVD-ROM",
Feb 23 02:38:13 localhost angry_dirac[27043]: "nr_requests": "2",
Feb 23 02:38:13 localhost angry_dirac[27043]: "partitions": {},
Feb 23 02:38:13 localhost angry_dirac[27043]: "path": "/dev/sr0",
Feb 23 02:38:13 localhost angry_dirac[27043]: "removable": "1",
Feb 23 02:38:13 localhost angry_dirac[27043]: "rev": "2.5+",
Feb 23 02:38:13 localhost angry_dirac[27043]: "ro": "0",
Feb 23 02:38:13 localhost angry_dirac[27043]: "rotational": "1",
Feb 23 02:38:13 localhost angry_dirac[27043]: "sas_address": "",
Feb 23 02:38:13 localhost angry_dirac[27043]: "sas_device_handle": "",
Feb 23 02:38:13 localhost angry_dirac[27043]: "scheduler_mode": "mq-deadline",
Feb 23 02:38:13 localhost angry_dirac[27043]: "sectors": 0,
Feb 23 02:38:13 localhost angry_dirac[27043]: "sectorsize": "2048",
Feb 23 02:38:13 localhost angry_dirac[27043]: "size": 493568.0,
Feb 23 02:38:13 localhost angry_dirac[27043]: "support_discard": "0",
Feb 23 02:38:13 localhost angry_dirac[27043]: "type": "disk",
Feb 23 02:38:13 localhost angry_dirac[27043]: "vendor": "QEMU"
Feb 23 02:38:13 localhost angry_dirac[27043]: }
Feb 23 02:38:13 localhost angry_dirac[27043]: }
Feb 23 02:38:13 localhost angry_dirac[27043]: ]
Feb 23 02:38:13 localhost systemd[1]: libpod-1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde.scope: Deactivated successfully.
Feb 23 02:38:13 localhost podman[27028]: 2026-02-23 07:38:13.197656689 +0000 UTC m=+0.939095239 container died 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, io.openshift.tags=rhceph ceph)
Feb 23 02:38:13 localhost systemd[1]: var-lib-containers-storage-overlay-5a33ff749735729ecbdef763b7fb50eadabab15c6924fd681e80a2b1c973bd42-merged.mount: Deactivated successfully.
Feb 23 02:38:13 localhost podman[28412]: 2026-02-23 07:38:13.287119474 +0000 UTC m=+0.075235118 container remove 1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_dirac, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, version=7, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , ceph=True, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git)
Feb 23 02:38:13 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:38:13 localhost systemd[1]: libpod-conmon-1af4f24cd1021bf6174e7978dbe57e2c0a33842ea874cd3715ffb89dfa4d4dde.scope: Deactivated successfully.
Feb 23 02:38:13 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully.
Feb 23 02:38:13 localhost systemd[1]: Closed Process Core Dump Socket.
Feb 23 02:38:13 localhost systemd[1]: Stopping Process Core Dump Socket...
Feb 23 02:38:13 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 23 02:38:13 localhost systemd[1]: Reloading.
Feb 23 02:38:13 localhost systemd-sysv-generator[28498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:38:13 localhost systemd-rc-local-generator[28492]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:38:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:38:14 localhost systemd[1]: Reloading.
Feb 23 02:38:14 localhost systemd-rc-local-generator[28530]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:38:14 localhost systemd-sysv-generator[28534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:38:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:38:31 localhost sshd[28542]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:38:34 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:38:34 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 23 02:38:34 localhost podman[28616]: Feb 23 02:38:34 localhost podman[28616]: 2026-02-23 07:38:34.517499148 +0000 UTC m=+0.067506972 container create 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, maintainer=Guillaume Abrioux , ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, RELEASE=main, name=rhceph) Feb 23 02:38:34 localhost systemd[1]: Started libpod-conmon-98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0.scope. Feb 23 02:38:34 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:34 localhost podman[28616]: 2026-02-23 07:38:34.588078132 +0000 UTC m=+0.138085976 container init 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, architecture=x86_64, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 02:38:34 localhost podman[28616]: 2026-02-23 07:38:34.495988971 +0000 UTC m=+0.045996805 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:34 localhost podman[28616]: 2026-02-23 07:38:34.598645065 +0000 UTC m=+0.148652909 container start 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, architecture=x86_64, RELEASE=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-type=git, GIT_BRANCH=main, 
url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Feb 23 02:38:34 localhost podman[28616]: 2026-02-23 07:38:34.59892691 +0000 UTC m=+0.148934764 container attach 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.42.2, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, 
release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, ceph=True) Feb 23 02:38:34 localhost tender_ardinghelli[28632]: 167 167 Feb 23 02:38:34 localhost systemd[1]: libpod-98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0.scope: Deactivated successfully. Feb 23 02:38:34 localhost podman[28616]: 2026-02-23 07:38:34.602810284 +0000 UTC m=+0.152818158 container died 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Feb 23 02:38:34 localhost 
podman[28637]: 2026-02-23 07:38:34.700445296 +0000 UTC m=+0.080817891 container remove 98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_ardinghelli, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, vcs-type=git, RELEASE=main) Feb 23 02:38:34 localhost systemd[1]: libpod-conmon-98d363638c91aa57109e86c0b3f714e363f8ab8eb9a3414d970068a9700e18e0.scope: Deactivated successfully. Feb 23 02:38:34 localhost systemd[1]: Reloading. Feb 23 02:38:34 localhost systemd-rc-local-generator[28676]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:34 localhost systemd-sysv-generator[28681]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 02:38:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:35 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 23 02:38:35 localhost systemd[1]: Reloading. Feb 23 02:38:35 localhost systemd-sysv-generator[28718]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:38:35 localhost systemd-rc-local-generator[28713]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:35 localhost systemd[1]: Reached target All Ceph clusters and services. Feb 23 02:38:35 localhost systemd[1]: Reloading. Feb 23 02:38:35 localhost systemd-rc-local-generator[28751]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:35 localhost systemd-sysv-generator[28756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:38:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:35 localhost systemd[1]: Reached target Ceph cluster f1fea371-cb69-578d-a3d0-b5c472a84b46. Feb 23 02:38:35 localhost systemd[1]: Reloading. Feb 23 02:38:35 localhost systemd-sysv-generator[28792]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:38:35 localhost systemd-rc-local-generator[28789]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:35 localhost systemd[1]: Reloading. Feb 23 02:38:35 localhost systemd-rc-local-generator[28830]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:35 localhost systemd-sysv-generator[28835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:38:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:36 localhost systemd[1]: Created slice Slice /system/ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46. Feb 23 02:38:36 localhost systemd[1]: Reached target System Time Set. Feb 23 02:38:36 localhost systemd[1]: Reached target System Time Synchronized. Feb 23 02:38:36 localhost systemd[1]: Starting Ceph crash.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46... Feb 23 02:38:36 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 23 02:38:36 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Feb 23 02:38:36 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Feb 23 02:38:36 localhost podman[28893]: Feb 23 02:38:36 localhost podman[28893]: 2026-02-23 07:38:36.371389811 +0000 UTC m=+0.078063116 container create fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main) Feb 23 02:38:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/808d93a0a95d4410bf1c501ca81cea82f0256a147cb0c664b66924089e99de68/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/808d93a0a95d4410bf1c501ca81cea82f0256a147cb0c664b66924089e99de68/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:36 localhost podman[28893]: 
2026-02-23 07:38:36.340234317 +0000 UTC m=+0.046907632 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/808d93a0a95d4410bf1c501ca81cea82f0256a147cb0c664b66924089e99de68/merged/etc/ceph/ceph.client.crash.np0005626463.keyring supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:36 localhost podman[28893]: 2026-02-23 07:38:36.462938394 +0000 UTC m=+0.169611699 container init fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, ceph=True, vcs-type=git, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.42.2, GIT_CLEAN=True, name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 02:38:36 localhost podman[28893]: 2026-02-23 07:38:36.486172242 +0000 UTC m=+0.192845557 container start fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git) Feb 23 02:38:36 localhost bash[28893]: fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 Feb 23 02:38:36 localhost systemd[1]: Started Ceph crash.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46. Feb 23 02:38:36 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: INFO:ceph-crash:pinging cluster to exercise our key, trying key client.crash.np0005626463. 
Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: cluster: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: id: f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: health: HEALTH_WARN Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: OSD count 0 < osd_pool_default_size 3 Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: services: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: mon: 3 daemons, quorum np0005626459,np0005626461,np0005626460 (age 5s) Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: mgr: np0005626459.pmtxxl(active, since 2m), standbys: np0005626461.lrfquh Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: osd: 0 osds: 0 up, 0 in Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: data: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: pools: 0 pools, 0 pgs Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: objects: 0 objects, 0 B Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: usage: 0 B used, 0 B / 0 B avail Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: pgs: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: 
progress: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: Updating crash deployment (+4 -> 6) (0s) Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: [............................] Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: Feb 23 02:38:37 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463[28907]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s Feb 23 02:38:44 localhost podman[29002]: Feb 23 02:38:44 localhost podman[29002]: 2026-02-23 07:38:44.841987415 +0000 UTC m=+0.073625934 container create 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 02:38:44 localhost systemd[1]: Started libpod-conmon-1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c.scope. Feb 23 02:38:44 localhost systemd[1]: Started libcrun container. Feb 23 02:38:44 localhost podman[29002]: 2026-02-23 07:38:44.811405421 +0000 UTC m=+0.043043940 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:44 localhost podman[29002]: 2026-02-23 07:38:44.916675262 +0000 UTC m=+0.148313791 container init 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, version=7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, release=1770267347, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=) Feb 23 02:38:44 localhost podman[29002]: 2026-02-23 07:38:44.926579622 +0000 UTC m=+0.158218161 container start 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=youthful_engelbart, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Feb 23 02:38:44 localhost podman[29002]: 2026-02-23 07:38:44.92691117 +0000 UTC m=+0.158549709 container attach 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, architecture=x86_64, name=rhceph, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 02:38:44 localhost youthful_engelbart[29017]: 167 167 Feb 23 02:38:44 localhost systemd[1]: libpod-1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c.scope: Deactivated successfully. Feb 23 02:38:44 localhost podman[29002]: 2026-02-23 07:38:44.929218831 +0000 UTC m=+0.160857350 container died 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, vcs-type=git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_CLEAN=True, release=1770267347, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph) Feb 23 02:38:45 localhost systemd[1]: var-lib-containers-storage-overlay-cef3fbdac90d616fc0d7c3545aa1aa61742dbae67db60e53c99ac07b9835c7c9-merged.mount: Deactivated successfully. Feb 23 02:38:45 localhost podman[29022]: 2026-02-23 07:38:45.016499589 +0000 UTC m=+0.074149470 container remove 1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_engelbart, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.42.2, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7) Feb 23 02:38:45 localhost systemd[1]: libpod-conmon-1200a2a161681097adfee2d2e6c3337792273985f4f3bfd3c8148e587a280c5c.scope: Deactivated successfully. 
Feb 23 02:38:45 localhost podman[29044]: Feb 23 02:38:45 localhost podman[29044]: 2026-02-23 07:38:45.236648938 +0000 UTC m=+0.072707935 container create 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph) Feb 23 02:38:45 localhost systemd[1]: Started libpod-conmon-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope. Feb 23 02:38:45 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:45 localhost podman[29044]: 2026-02-23 07:38:45.206657895 +0000 UTC m=+0.042716882 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:45 localhost podman[29044]: 2026-02-23 07:38:45.362070847 +0000 UTC m=+0.198129844 container init 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph) Feb 23 02:38:45 localhost podman[29044]: 2026-02-23 07:38:45.37299019 +0000 UTC m=+0.209049187 container start 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.42.2, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main) Feb 23 02:38:45 localhost podman[29044]: 2026-02-23 07:38:45.373257574 +0000 UTC m=+0.209316611 container attach 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public) Feb 23 02:38:45 localhost relaxed_newton[29060]: --> passed data devices: 0 physical, 2 LVM Feb 23 02:38:45 localhost relaxed_newton[29060]: --> relative data size: 1.0 Feb 23 02:38:45 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key Feb 23 02:38:45 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 
3c38c3a7-5c4b-4b97-99e3-119e348f6df6 Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key Feb 23 02:38:46 localhost lvm[29114]: PV /dev/loop3 online, VG ceph_vg0 is complete. Feb 23 02:38:46 localhost lvm[29114]: VG ceph_vg0 finished Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2 Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0 Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap Feb 23 02:38:46 localhost relaxed_newton[29060]: stderr: got monmap epoch 3 Feb 23 02:38:46 localhost relaxed_newton[29060]: --> Creating keyring file for osd.2 Feb 23 02:38:46 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring Feb 23 02:38:47 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/ Feb 23 02:38:47 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 3c38c3a7-5c4b-4b97-99e3-119e348f6df6 --setuser ceph --setgroup ceph Feb 23 02:38:49 localhost relaxed_newton[29060]: stderr: 2026-02-23T07:38:47.061+0000 7f6e37b08a80 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void 
bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3] Feb 23 02:38:49 localhost relaxed_newton[29060]: stderr: 2026-02-23T07:38:47.061+0000 7f6e37b08a80 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid Feb 23 02:38:49 localhost relaxed_newton[29060]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0 Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 Feb 23 02:38:49 localhost relaxed_newton[29060]: --> ceph-volume lvm activate successful for osd ID: 2 Feb 23 02:38:49 localhost relaxed_newton[29060]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0 Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key Feb 23 02:38:49 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 79650a5e-2685-4848-a7c4-7cead1e09ea1 Feb 23 02:38:50 localhost lvm[30046]: PV /dev/loop4 online, VG ceph_vg1 is complete. 
Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-authtool --gen-print-key Feb 23 02:38:50 localhost lvm[30046]: VG ceph_vg1 finished Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-5 Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1 Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-5/activate.monmap Feb 23 02:38:50 localhost relaxed_newton[29060]: stderr: got monmap epoch 3 Feb 23 02:38:50 localhost relaxed_newton[29060]: --> Creating keyring file for osd.5 Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/keyring Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/ Feb 23 02:38:50 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 5 --monmap /var/lib/ceph/osd/ceph-5/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-5/ --osd-uuid 79650a5e-2685-4848-a7c4-7cead1e09ea1 --setuser ceph --setgroup ceph Feb 23 02:38:53 localhost relaxed_newton[29060]: stderr: 2026-02-23T07:38:50.824+0000 7f44dd1baa80 -1 bluestore(/var/lib/ceph/osd/ceph-5//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3] Feb 23 02:38:53 
localhost relaxed_newton[29060]: stderr: 2026-02-23T07:38:50.824+0000 7f44dd1baa80 -1 bluestore(/var/lib/ceph/osd/ceph-5/) _read_fsid unparsable uuid Feb 23 02:38:53 localhost relaxed_newton[29060]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1 Feb 23 02:38:53 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 Feb 23 02:38:53 localhost relaxed_newton[29060]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-5 --no-mon-config Feb 23 02:38:53 localhost relaxed_newton[29060]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block Feb 23 02:38:53 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block Feb 23 02:38:53 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Feb 23 02:38:53 localhost relaxed_newton[29060]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 Feb 23 02:38:53 localhost relaxed_newton[29060]: --> ceph-volume lvm activate successful for osd ID: 5 Feb 23 02:38:53 localhost relaxed_newton[29060]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1 Feb 23 02:38:53 localhost systemd[1]: libpod-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope: Deactivated successfully. Feb 23 02:38:53 localhost systemd[1]: libpod-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope: Consumed 3.706s CPU time. 
Feb 23 02:38:53 localhost podman[30944]: 2026-02-23 07:38:53.490366165 +0000 UTC m=+0.049480497 container died 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 02:38:53 localhost systemd[1]: tmp-crun.caYVru.mount: Deactivated successfully. Feb 23 02:38:53 localhost systemd[1]: var-lib-containers-storage-overlay-ea140969b302d3c2ad5fdfb4390c95d64337f400c1fa01e354c1e83ab8644fb3-merged.mount: Deactivated successfully. 
Feb 23 02:38:53 localhost podman[30944]: 2026-02-23 07:38:53.529956612 +0000 UTC m=+0.089070904 container remove 215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_newton, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, ceph=True, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc.) Feb 23 02:38:53 localhost systemd[1]: libpod-conmon-215492e48e1299c27dbf8ddaa82dff00c684596d495de2e7dff14262cb6267cb.scope: Deactivated successfully. 
Feb 23 02:38:54 localhost podman[31025]: Feb 23 02:38:54 localhost podman[31025]: 2026-02-23 07:38:54.2707077 +0000 UTC m=+0.069032852 container create 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, version=7) Feb 23 02:38:54 localhost systemd[1]: Started libpod-conmon-1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a.scope. Feb 23 02:38:54 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:54 localhost podman[31025]: 2026-02-23 07:38:54.335438655 +0000 UTC m=+0.133763807 container init 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_CLEAN=True, vcs-type=git, release=1770267347, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.42.2) Feb 23 02:38:54 localhost podman[31025]: 2026-02-23 07:38:54.242757464 +0000 UTC m=+0.041082616 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:54 localhost podman[31025]: 2026-02-23 07:38:54.344000926 +0000 UTC m=+0.142326078 container start 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, ceph=True, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 
7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vcs-type=git) Feb 23 02:38:54 localhost podman[31025]: 2026-02-23 07:38:54.344247318 +0000 UTC m=+0.142572470 container attach 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.description=Red Hat 
Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 23 02:38:54 localhost quizzical_chatelet[31041]: 167 167 Feb 23 02:38:54 localhost systemd[1]: libpod-1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a.scope: Deactivated successfully. Feb 23 02:38:54 localhost podman[31025]: 2026-02-23 07:38:54.347770503 +0000 UTC m=+0.146095685 container died 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, GIT_CLEAN=True, release=1770267347, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, version=7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container) Feb 23 02:38:54 localhost podman[31046]: 2026-02-23 
07:38:54.430979978 +0000 UTC m=+0.071929104 container remove 1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_chatelet, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64) Feb 23 02:38:54 localhost systemd[1]: libpod-conmon-1edf31218dfc798eb52a2c69add5ca018cfe2dc2a3e4bd4269f445de0b0ac93a.scope: Deactivated successfully. Feb 23 02:38:54 localhost systemd[1]: var-lib-containers-storage-overlay-7b9d5611a60fac7d0fb8943e20608c63a05668d286a9d63cf33afc8a831c9bc1-merged.mount: Deactivated successfully. 
Feb 23 02:38:54 localhost podman[31065]: Feb 23 02:38:54 localhost podman[31065]: 2026-02-23 07:38:54.610715747 +0000 UTC m=+0.041389143 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:55 localhost podman[31065]: 2026-02-23 07:38:55.075754252 +0000 UTC m=+0.506427648 container create f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, vcs-type=git, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public) Feb 23 02:38:55 localhost systemd[1]: Started libpod-conmon-f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1.scope. Feb 23 02:38:55 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:55 localhost podman[31065]: 2026-02-23 07:38:55.172839724 +0000 UTC m=+0.603513120 container init f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, io.openshift.expose-services=, GIT_CLEAN=True, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , 
io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7) Feb 23 02:38:55 localhost podman[31065]: 2026-02-23 07:38:55.182605147 +0000 UTC m=+0.613278553 container start f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.42.2, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 02:38:55 localhost podman[31065]: 2026-02-23 07:38:55.182839849 +0000 UTC m=+0.613513245 container attach f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, RELEASE=main, release=1770267347, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True) Feb 23 02:38:55 localhost flamboyant_curie[31081]: { Feb 23 02:38:55 localhost flamboyant_curie[31081]: "2": [ Feb 23 02:38:55 localhost flamboyant_curie[31081]: { Feb 23 02:38:55 localhost flamboyant_curie[31081]: "devices": [ Feb 23 02:38:55 localhost flamboyant_curie[31081]: "/dev/loop3" Feb 23 02:38:55 localhost flamboyant_curie[31081]: ], Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_name": "ceph_lv0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_path": "/dev/ceph_vg0/ceph_lv0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_size": "7511998464", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=QzbSmV-1Ft3-Pm3v-HD9F-5Gxf-hEgf-DOunqH,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f1fea371-cb69-578d-a3d0-b5c472a84b46,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=3c38c3a7-5c4b-4b97-99e3-119e348f6df6,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: 
"lv_uuid": "QzbSmV-1Ft3-Pm3v-HD9F-5Gxf-hEgf-DOunqH", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "name": "ceph_lv0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "path": "/dev/ceph_vg0/ceph_lv0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "tags": { Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.block_uuid": "QzbSmV-1Ft3-Pm3v-HD9F-5Gxf-hEgf-DOunqH", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.cephx_lockbox_secret": "", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.cluster_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.cluster_name": "ceph", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.crush_device_class": "", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.encrypted": "0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.osd_fsid": "3c38c3a7-5c4b-4b97-99e3-119e348f6df6", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.osd_id": "2", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.osdspec_affinity": "default_drive_group", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.type": "block", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.vdo": "0" Feb 23 02:38:55 localhost flamboyant_curie[31081]: }, Feb 23 02:38:55 localhost flamboyant_curie[31081]: "type": "block", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "vg_name": "ceph_vg0" Feb 23 02:38:55 localhost flamboyant_curie[31081]: } Feb 23 02:38:55 localhost flamboyant_curie[31081]: ], Feb 23 02:38:55 localhost flamboyant_curie[31081]: "5": [ Feb 23 02:38:55 localhost flamboyant_curie[31081]: { Feb 23 02:38:55 localhost flamboyant_curie[31081]: "devices": [ Feb 23 02:38:55 localhost flamboyant_curie[31081]: "/dev/loop4" Feb 23 02:38:55 localhost flamboyant_curie[31081]: ], Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_name": "ceph_lv1", 
Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_path": "/dev/ceph_vg1/ceph_lv1", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_size": "7511998464", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=viXkli-dRMc-hUAL-qD67-UJ3I-enjo-w2BLkV,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f1fea371-cb69-578d-a3d0-b5c472a84b46,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=79650a5e-2685-4848-a7c4-7cead1e09ea1,ceph.osd_id=5,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "lv_uuid": "viXkli-dRMc-hUAL-qD67-UJ3I-enjo-w2BLkV", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "name": "ceph_lv1", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "path": "/dev/ceph_vg1/ceph_lv1", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "tags": { Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.block_uuid": "viXkli-dRMc-hUAL-qD67-UJ3I-enjo-w2BLkV", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.cephx_lockbox_secret": "", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.cluster_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.cluster_name": "ceph", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.crush_device_class": "", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.encrypted": "0", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.osd_fsid": "79650a5e-2685-4848-a7c4-7cead1e09ea1", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.osd_id": "5", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.osdspec_affinity": "default_drive_group", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.type": "block", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "ceph.vdo": "0" Feb 23 
02:38:55 localhost flamboyant_curie[31081]: }, Feb 23 02:38:55 localhost flamboyant_curie[31081]: "type": "block", Feb 23 02:38:55 localhost flamboyant_curie[31081]: "vg_name": "ceph_vg1" Feb 23 02:38:55 localhost flamboyant_curie[31081]: } Feb 23 02:38:55 localhost flamboyant_curie[31081]: ] Feb 23 02:38:55 localhost flamboyant_curie[31081]: } Feb 23 02:38:55 localhost systemd[1]: libpod-f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1.scope: Deactivated successfully. Feb 23 02:38:55 localhost podman[31065]: 2026-02-23 07:38:55.522629414 +0000 UTC m=+0.953302850 container died f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, release=1770267347, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-type=git, version=7, io.openshift.expose-services=, distribution-scope=public) Feb 23 02:38:55 localhost systemd[1]: tmp-crun.2EhrIm.mount: Deactivated successfully. 
Feb 23 02:38:55 localhost systemd[1]: var-lib-containers-storage-overlay-6e7bc5eefd4eabb95507f861615322272d0998c6323fea80fee47b37ac39fb9b-merged.mount: Deactivated successfully. Feb 23 02:38:55 localhost podman[31090]: 2026-02-23 07:38:55.607859015 +0000 UTC m=+0.075814468 container remove f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_curie, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.openshift.expose-services=) Feb 23 02:38:55 localhost systemd[1]: libpod-conmon-f9d441f2c6fa2a875cf3f1b07b186f42916036919f6a4db4ba4dddeebb2a2bb1.scope: Deactivated successfully. 
Feb 23 02:38:56 localhost podman[31177]: Feb 23 02:38:56 localhost podman[31177]: 2026-02-23 07:38:56.343921767 +0000 UTC m=+0.070319580 container create 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph) Feb 23 02:38:56 localhost systemd[1]: Started libpod-conmon-942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84.scope. Feb 23 02:38:56 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:56 localhost podman[31177]: 2026-02-23 07:38:56.409102127 +0000 UTC m=+0.135499940 container init 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, RELEASE=main, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 02:38:56 localhost podman[31177]: 2026-02-23 07:38:56.316190733 +0000 UTC m=+0.042588546 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:56 localhost podman[31177]: 2026-02-23 07:38:56.418974235 +0000 UTC m=+0.145372048 container start 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
maintainer=Guillaume Abrioux , vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.expose-services=, release=1770267347, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7) Feb 23 02:38:56 localhost podman[31177]: 2026-02-23 07:38:56.419208707 +0000 UTC m=+0.145606530 container attach 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, name=rhceph, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7) Feb 23 02:38:56 localhost funny_colden[31192]: 167 167 Feb 23 02:38:56 localhost systemd[1]: libpod-942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84.scope: Deactivated successfully. Feb 23 02:38:56 localhost podman[31177]: 2026-02-23 07:38:56.421580651 +0000 UTC m=+0.147978534 container died 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 
02:38:56 localhost podman[31197]: 2026-02-23 07:38:56.516911252 +0000 UTC m=+0.083010595 container remove 942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_colden, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.42.2, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1770267347, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z) Feb 23 02:38:56 localhost systemd[1]: libpod-conmon-942c2fb3150f2fcc2e3f0e57cf1c2f786f36ec8484e24ad57b698b25d1c7ce84.scope: Deactivated successfully. Feb 23 02:38:56 localhost systemd[1]: var-lib-containers-storage-overlay-3dba304662f17f95e27d680ba2f2ef51ce43633d3acbeb54c4928dd0f4233264-merged.mount: Deactivated successfully. 
Feb 23 02:38:56 localhost podman[31224]: Feb 23 02:38:56 localhost podman[31224]: 2026-02-23 07:38:56.854195616 +0000 UTC m=+0.076771299 container create 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, release=1770267347, RELEASE=main, GIT_BRANCH=main, ceph=True, architecture=x86_64) Feb 23 02:38:56 localhost systemd[1]: Started libpod-conmon-42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2.scope. Feb 23 02:38:56 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:56 localhost podman[31224]: 2026-02-23 07:38:56.824785653 +0000 UTC m=+0.047361326 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:38:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff) Feb 23 02:38:56 localhost podman[31224]: 2026-02-23 07:38:56.983035215 +0000 UTC m=+0.205610888 container init 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vendor=Red Hat, Inc., release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, CEPH_POINT_RELEASE=) Feb 23 02:38:56 localhost podman[31224]: 2026-02-23 07:38:56.993984388 +0000 UTC m=+0.216560071 container start 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=) Feb 23 02:38:56 localhost podman[31224]: 2026-02-23 07:38:56.99419587 +0000 UTC m=+0.216771543 container attach 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, release=1770267347, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 02:38:57 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test[31240]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Feb 23 02:38:57 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test[31240]: [--no-systemd] 
[--no-tmpfs] Feb 23 02:38:57 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test[31240]: ceph-volume activate: error: unrecognized arguments: --bad-option Feb 23 02:38:57 localhost systemd[1]: libpod-42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2.scope: Deactivated successfully. Feb 23 02:38:57 localhost podman[31224]: 2026-02-23 07:38:57.205623211 +0000 UTC m=+0.428198944 container died 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, name=rhceph, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph) Feb 23 02:38:57 localhost podman[31245]: 2026-02-23 07:38:57.279413052 +0000 UTC m=+0.064040250 container remove 42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate-test, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., version=7, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main) Feb 23 02:38:57 localhost systemd-journald[618]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Feb 23 02:38:57 localhost systemd-journald[618]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 02:38:57 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 02:38:57 localhost systemd[1]: libpod-conmon-42208a29ce264b4593c6dd65cea427617bdba7d45838078937dd494e4be83ce2.scope: Deactivated successfully. Feb 23 02:38:57 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 02:38:57 localhost systemd[1]: Reloading. Feb 23 02:38:57 localhost systemd-sysv-generator[31307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:38:57 localhost systemd-rc-local-generator[31301]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:57 localhost systemd[1]: var-lib-containers-storage-overlay-7f36cb8087ed81274f1b45435412b2e883469909d555438e5d1c1c13f4447f30-merged.mount: Deactivated successfully. Feb 23 02:38:57 localhost systemd[1]: tmp-crun.tseIDR.mount: Deactivated successfully. Feb 23 02:38:57 localhost systemd[1]: Reloading. Feb 23 02:38:57 localhost systemd-rc-local-generator[31344]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:38:57 localhost systemd-sysv-generator[31350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:38:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:38:58 localhost systemd[1]: Starting Ceph osd.2 for f1fea371-cb69-578d-a3d0-b5c472a84b46... 
Feb 23 02:38:58 localhost podman[31410]: Feb 23 02:38:58 localhost podman[31410]: 2026-02-23 07:38:58.45535684 +0000 UTC m=+0.075271810 container create 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, distribution-scope=public, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main) Feb 23 02:38:58 localhost systemd[1]: Started libcrun container. 
Feb 23 02:38:58 localhost podman[31410]: 2026-02-23 07:38:58.423658797 +0000 UTC m=+0.043573767 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:38:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:58 localhost podman[31410]: 2026-02-23 07:38:58.584178018 +0000 UTC m=+0.204092998 container init 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 02:38:58 localhost podman[31410]: 2026-02-23 07:38:58.594352552 +0000 UTC m=+0.214267532 container start 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1770267347, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z)
Feb 23 02:38:58 localhost podman[31410]: 2026-02-23 07:38:58.594584564 +0000 UTC m=+0.214499574 container attach 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, ceph=True, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64)
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 02:38:59 localhost bash[31410]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 02:38:59 localhost bash[31410]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 02:38:59 localhost bash[31410]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 23 02:38:59 localhost bash[31410]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 23 02:38:59 localhost bash[31410]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 02:38:59 localhost bash[31410]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 23 02:38:59 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate[31424]: --> ceph-volume raw activate successful for osd ID: 2
Feb 23 02:38:59 localhost bash[31410]: --> ceph-volume raw activate successful for osd ID: 2
Feb 23 02:38:59 localhost systemd[1]: libpod-27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41.scope: Deactivated successfully.
Feb 23 02:38:59 localhost podman[31410]: 2026-02-23 07:38:59.347653189 +0000 UTC m=+0.967568149 container died 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-type=git)
Feb 23 02:38:59 localhost systemd[1]: var-lib-containers-storage-overlay-ce8a25d5b5f74137ce6b7cf3b305bb834daae40d6b3bf105988c1aa66548c3ed-merged.mount: Deactivated successfully.
Feb 23 02:38:59 localhost podman[31554]: 2026-02-23 07:38:59.433315482 +0000 UTC m=+0.073312727 container remove 27737286289b4f8ecf5f7e7d41ff1505e11b4ac91e778dac58f59d9cdf9ebe41 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2-activate, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, version=7, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 23 02:38:59 localhost podman[31615]:
Feb 23 02:38:59 localhost podman[31615]: 2026-02-23 07:38:59.755111333 +0000 UTC m=+0.073650254 container create 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, io.openshift.expose-services=, ceph=True, CEPH_POINT_RELEASE=, release=1770267347, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, architecture=x86_64)
Feb 23 02:38:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:59 localhost podman[31615]: 2026-02-23 07:38:59.727471873 +0000 UTC m=+0.046010824 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:38:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5137b21979481ad80623953c6db1be153fc037ec2d4ab0d00401398e0ddaecd5/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 23 02:38:59 localhost podman[31615]: 2026-02-23 07:38:59.878944309 +0000 UTC m=+0.197483240 container init 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z)
Feb 23 02:38:59 localhost podman[31615]: 2026-02-23 07:38:59.88754034 +0000 UTC m=+0.206079271 container start 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.42.2, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 02:38:59 localhost bash[31615]: 862eadaff641589ceb245e67477cf75d6f44dd8a2e370794aa63510852a63e9d
Feb 23 02:38:59 localhost systemd[1]: Started Ceph osd.2 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 02:38:59 localhost ceph-osd[31633]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 02:38:59 localhost ceph-osd[31633]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 23 02:38:59 localhost ceph-osd[31633]: pidfile_write: ignore empty --pid-file
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:38:59 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:38:59 localhost ceph-osd[31633]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 23 02:38:59 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c84e00 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 02:39:00 localhost ceph-osd[31633]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Feb 23 02:39:00 localhost ceph-osd[31633]: load: jerasure load: lrc
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:00 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:00 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:00 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 02:39:00 localhost podman[31726]:
Feb 23 02:39:00 localhost podman[31726]: 2026-02-23 07:39:00.803737172 +0000 UTC m=+0.067161584 container create 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , ceph=True, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container)
Feb 23 02:39:00 localhost systemd[1]: Started libpod-conmon-82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be.scope.
Feb 23 02:39:00 localhost podman[31726]: 2026-02-23 07:39:00.77452016 +0000 UTC m=+0.037944572 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:39:00 localhost systemd[1]: Started libcrun container.
Feb 23 02:39:00 localhost podman[31726]: 2026-02-23 07:39:00.892626495 +0000 UTC m=+0.156050907 container init 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main)
Feb 23 02:39:00 localhost systemd[1]: tmp-crun.QghnOo.mount: Deactivated successfully.
Feb 23 02:39:00 localhost podman[31726]: 2026-02-23 07:39:00.904690428 +0000 UTC m=+0.168114840 container start 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 02:39:00 localhost podman[31726]: 2026-02-23 07:39:00.905169693 +0000 UTC m=+0.168594105 container attach 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main)
Feb 23 02:39:00 localhost magical_gould[31745]: 167 167
Feb 23 02:39:00 localhost systemd[1]: libpod-82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be.scope: Deactivated successfully.
Feb 23 02:39:00 localhost podman[31726]: 2026-02-23 07:39:00.909023565 +0000 UTC m=+0.172448047 container died 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, distribution-scope=public, GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, release=1770267347, build-date=2026-02-09T10:25:24Z, architecture=x86_64, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 02:39:01 localhost podman[31750]: 2026-02-23 07:39:01.005406011 +0000 UTC m=+0.081117406 container remove 82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_gould, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 02:39:01 localhost systemd[1]: libpod-conmon-82d3efef4415469d7279043f8766d4c08df298d22f726e60fc08640edaa6c0be.scope: Deactivated successfully.
Feb 23 02:39:01 localhost ceph-osd[31633]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs mount
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs mount shared_bdev_used = 0
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: RocksDB version: 7.9.2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Git sha 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: DB SUMMARY
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: DB Session ID: T252W6HLUGEFWZ0H917R
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: CURRENT file: CURRENT
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: IDENTITY file: IDENTITY
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.error_if_exists: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.create_if_missing: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.env: 0x557957a9fe30
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.fs: LegacyFileSystem
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.info_log: 0x557956cc6740
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_file_opening_threads: 16
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.statistics: (nil)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_fsync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_log_file_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.log_file_time_to_roll: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.keep_log_file_num: 1000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.recycle_log_file_num: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_fallocate: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_mmap_reads: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_mmap_writes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_direct_reads: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.create_missing_column_families: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.db_log_dir:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_dir: db.wal
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_cache_numshardbits: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.advise_random_on_open: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.db_write_buffer_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_manager: 0x557956c6f4a0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_adaptive_mutex: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.rate_limiter: (nil)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_recovery_mode: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_thread_tracking: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_pipelined_write: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.unordered_write: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.row_cache: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_ingest_behind: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.two_write_queues: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.manual_wal_flush: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_compression: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.atomic_flush: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.persist_stats_to_disk: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.log_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.file_checksum_gen_factory: Unknown
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.best_efforts_recovery: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_data_in_errors: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.db_host_id: __hostname__
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enforce_single_del_contracts: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_background_jobs: 4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_background_compactions: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_subcompactions: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.avoid_flush_during_shutdown: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.writable_file_max_buffer_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.delayed_write_rate : 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_total_wal_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.stats_dump_period_sec: 600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.stats_persist_period_sec: 600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.stats_history_buffer_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_open_files: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bytes_per_sync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_bytes_per_sync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb:
Options.strict_bytes_per_sync: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_background_flushes: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Compression algorithms supported: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kZSTD supported: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kXpressCompression supported: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kBZip2Compression supported: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kLZ4Compression supported: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kZlibCompression supported: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kSnappyCompression supported: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: DMutex implementation: pthread_mutex_t Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 
02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 
02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 
02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: 
rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for
column family [p-2]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6900)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 
32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 
02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 
localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 
02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 30259abc-06cf-4764-a5f7-1c462702ec25
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341059543, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341059832, "job": 1, "event": "recovery_finished"}
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Feb 23 02:39:01 localhost ceph-osd[31633]: freelist init
Feb 23 02:39:01 localhost ceph-osd[31633]: freelist _read_cfg
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs umount
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) close
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 23 02:39:01 localhost ceph-osd[31633]: bdev(0x557956c85500 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs mount
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 02:39:01 localhost ceph-osd[31633]: bluefs mount shared_bdev_used = 4718592
Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: RocksDB version: 7.9.2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Git sha 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: DB SUMMARY
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: DB Session ID: T252W6HLUGEFWZ0H917Q
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: CURRENT file: CURRENT
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: IDENTITY file: IDENTITY
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.error_if_exists: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.create_if_missing: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.env: 0x557957ac0380
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.fs: LegacyFileSystem
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.info_log: 0x557956cc68c0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_file_opening_threads: 16
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.statistics: (nil)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_fsync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_log_file_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.log_file_time_to_roll: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.keep_log_file_num: 1000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.recycle_log_file_num: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_fallocate: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_mmap_reads: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_mmap_writes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_direct_reads: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.create_missing_column_families: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.db_log_dir:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_dir: db.wal
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_cache_numshardbits: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.advise_random_on_open: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.db_write_buffer_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_manager: 0x557956c6f4a0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.use_adaptive_mutex: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.rate_limiter: (nil)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_recovery_mode: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_thread_tracking: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_pipelined_write: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.unordered_write: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.row_cache: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_ingest_behind: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.two_write_queues: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.manual_wal_flush: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_compression: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.atomic_flush: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.persist_stats_to_disk: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.log_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.file_checksum_gen_factory: Unknown
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.best_efforts_recovery: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.allow_data_in_errors: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.db_host_id: __hostname__
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enforce_single_del_contracts: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_background_jobs: 4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_background_compactions: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_subcompactions: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.avoid_flush_during_shutdown: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.writable_file_max_buffer_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.delayed_write_rate : 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_total_wal_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.stats_dump_period_sec: 600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.stats_persist_period_sec: 600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.stats_history_buffer_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_open_files: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bytes_per_sync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.wal_bytes_per_sync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.strict_bytes_per_sync: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_readahead_size: 2097152
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_background_flushes: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Compression algorithms supported:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kZSTD supported: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kXpressCompression supported: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kBZip2Compression supported: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kLZ4Compression supported: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kZlibCompression supported: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kLZ4HCCompression supported: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: #011kSnappyCompression supported: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 
filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 
localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557956cc6b20)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557957c53380)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:01 localhost ceph-osd[31633]:
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 
localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557957c53380)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 
02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost 
ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, 
name: O-2) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.merge_operator: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x557957c53380)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x557956c5c2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression: LZ4 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.num_levels: 7 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: 
rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
Options.blob_compression_type: NoCompression Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: 
[db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 30259abc-06cf-4764-a5f7-1c462702ec25 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341339176, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341344340, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, 
"num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832341, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "30259abc-06cf-4764-a5f7-1c462702ec25", "db_session_id": "T252W6HLUGEFWZ0H917Q", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341348734, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; 
max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832341, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "30259abc-06cf-4764-a5f7-1c462702ec25", "db_session_id": "T252W6HLUGEFWZ0H917Q", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341353551, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832341, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "30259abc-06cf-4764-a5f7-1c462702ec25", "db_session_id": "T252W6HLUGEFWZ0H917Q", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO 
error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832341357946, "job": 1, "event": "recovery_finished"} Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Feb 23 02:39:01 localhost podman[31971]: Feb 23 02:39:01 localhost podman[31971]: 2026-02-23 07:39:01.369306731 +0000 UTC m=+0.089500016 container create b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7) Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x557956d1e700 Feb 23 02:39:01 localhost ceph-osd[31633]: 
rocksdb: DB pointer 0x557956da3a00 Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4 Feb 23 02:39:01 localhost ceph-osd[31633]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 02:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 
0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] 
**#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 
secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown 
for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012 Feb 23 02:39:01 localhost ceph-osd[31633]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Feb 23 02:39:01 localhost ceph-osd[31633]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Feb 23 02:39:01 localhost ceph-osd[31633]: _get_class not permitted to load lua Feb 23 02:39:01 localhost ceph-osd[31633]: _get_class not permitted to load sdk Feb 23 02:39:01 localhost ceph-osd[31633]: _get_class not permitted to load test_remote_reads Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 load_pgs Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 load_pgs opened 0 pgs Feb 23 02:39:01 localhost ceph-osd[31633]: osd.2 0 log_to_monitors true Feb 23 02:39:01 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2[31629]: 2026-02-23T07:39:01.397+0000 7feb3f211a80 -1 osd.2 0 log_to_monitors true Feb 23 02:39:01 localhost systemd[1]: Started libpod-conmon-b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db.scope. Feb 23 02:39:01 localhost systemd[1]: Started libcrun container. 
Feb 23 02:39:01 localhost podman[31971]: 2026-02-23 07:39:01.339147319 +0000 UTC m=+0.059340604 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:39:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:01 localhost podman[31971]: 2026-02-23 07:39:01.502255606 +0000 UTC m=+0.222448871 container init b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, 
version=7, GIT_BRANCH=main, ceph=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.42.2, GIT_CLEAN=True) Feb 23 02:39:01 localhost podman[31971]: 2026-02-23 07:39:01.512987609 +0000 UTC m=+0.233180904 container start b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, vcs-type=git, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 02:39:01 localhost podman[31971]: 2026-02-23 07:39:01.513259243 +0000 UTC m=+0.233452538 container attach b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , release=1770267347, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True) Feb 23 02:39:01 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test[32200]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Feb 23 02:39:01 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test[32200]: [--no-systemd] [--no-tmpfs] Feb 23 02:39:01 localhost 
ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test[32200]: ceph-volume activate: error: unrecognized arguments: --bad-option Feb 23 02:39:01 localhost systemd[1]: libpod-b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db.scope: Deactivated successfully. Feb 23 02:39:01 localhost podman[31971]: 2026-02-23 07:39:01.773380221 +0000 UTC m=+0.493573506 container died b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, version=7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 02:39:01 localhost systemd[1]: var-lib-containers-storage-overlay-5a37738ab23f69cfa016b219998c51db9748d85f1b0540dfceedf387378eb99f-merged.mount: Deactivated successfully. Feb 23 02:39:01 localhost systemd[1]: tmp-crun.uBmgG4.mount: Deactivated successfully. 
Feb 23 02:39:01 localhost systemd[1]: var-lib-containers-storage-overlay-4deba69c444be98bfd31554cd680a1f083fb003199d093505109e816a6f9290d-merged.mount: Deactivated successfully. Feb 23 02:39:01 localhost podman[32205]: 2026-02-23 07:39:01.86164332 +0000 UTC m=+0.077261134 container remove b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate-test, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, RELEASE=main, name=rhceph) Feb 23 02:39:01 localhost systemd[1]: libpod-conmon-b07f9afdcc33ffaa0bd9ee09509fb8ff2a7b04ccc6db6917389d8b3278c460db.scope: Deactivated successfully. Feb 23 02:39:02 localhost systemd[1]: Reloading. Feb 23 02:39:02 localhost systemd-rc-local-generator[32257]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 02:39:02 localhost systemd-sysv-generator[32264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:39:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:39:02 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Feb 23 02:39:02 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Feb 23 02:39:02 localhost systemd[1]: Reloading. Feb 23 02:39:02 localhost systemd-sysv-generator[32308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:39:02 localhost systemd-rc-local-generator[32303]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:39:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:39:02 localhost systemd[1]: Starting Ceph osd.5 for f1fea371-cb69-578d-a3d0-b5c472a84b46... 
Feb 23 02:39:03 localhost podman[32366]: Feb 23 02:39:03 localhost podman[32366]: 2026-02-23 07:39:03.109811445 +0000 UTC m=+0.082519130 container create 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, version=7, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., release=1770267347, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z) Feb 23 02:39:03 localhost systemd[1]: tmp-crun.qguPYS.mount: Deactivated successfully. Feb 23 02:39:03 localhost podman[32366]: 2026-02-23 07:39:03.07653387 +0000 UTC m=+0.049241555 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:39:03 localhost systemd[1]: Started libcrun container. 
Feb 23 02:39:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 done with init, starting boot process
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 start_boot
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 23 02:39:03 localhost ceph-osd[31633]: osd.2 0 bench count 12288000 bsize 4 KiB
Feb 23 02:39:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:03 localhost podman[32366]: 2026-02-23 07:39:03.26585109 +0000 UTC m=+0.238558765 container init 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Feb 23 02:39:03 localhost podman[32366]: 2026-02-23 07:39:03.291990352 +0000 UTC m=+0.264698027 container start 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 02:39:03 localhost podman[32366]: 2026-02-23 07:39:03.292312789 +0000 UTC m=+0.265020464 container attach 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, build-date=2026-02-09T10:25:24Z, RELEASE=main, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 23 02:39:03 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 02:39:03 localhost bash[32366]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 02:39:03 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 02:39:03 localhost bash[32366]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 02:39:04 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 02:39:04 localhost bash[32366]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 23 02:39:04 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 23 02:39:04 localhost bash[32366]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 23 02:39:04 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:04 localhost bash[32366]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:04 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 02:39:04 localhost bash[32366]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 23 02:39:04 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate[32380]: --> ceph-volume raw activate successful for osd ID: 5
Feb 23 02:39:04 localhost bash[32366]: --> ceph-volume raw activate successful for osd ID: 5
Feb 23 02:39:04 localhost systemd[1]: libpod-8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077.scope: Deactivated successfully.
Feb 23 02:39:04 localhost podman[32366]: 2026-02-23 07:39:04.071314804 +0000 UTC m=+1.044022459 container died 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, ceph=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-type=git, distribution-scope=public)
Feb 23 02:39:04 localhost systemd[1]: tmp-crun.A2j4RE.mount: Deactivated successfully.
Feb 23 02:39:04 localhost systemd[26179]: Starting Mark boot as successful...
Feb 23 02:39:04 localhost systemd[1]: var-lib-containers-storage-overlay-e0e38258921875b1b49a9719926b0ea2472d1235d7f77c36277a39f70203edc0-merged.mount: Deactivated successfully.
Feb 23 02:39:04 localhost systemd[26179]: Finished Mark boot as successful.
Feb 23 02:39:04 localhost podman[32495]: 2026-02-23 07:39:04.188064468 +0000 UTC m=+0.104364166 container remove 8345c25358efb72de5c480aed55d8b89f911e3fbc376f576e5f5cda716717077 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, maintainer=Guillaume Abrioux )
Feb 23 02:39:04 localhost podman[32556]:
Feb 23 02:39:04 localhost podman[32556]: 2026-02-23 07:39:04.56438526 +0000 UTC m=+0.091005126 container create 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 02:39:04 localhost podman[32556]: 2026-02-23 07:39:04.524008141 +0000 UTC m=+0.050628027 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:39:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/646d94a2aef9f44152fb31e8a85ecadbceb6f276fba1497b89b238712cecdceb/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 23 02:39:04 localhost podman[32556]: 2026-02-23 07:39:04.698105245 +0000 UTC m=+0.224725111 container init 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main)
Feb 23 02:39:04 localhost podman[32556]: 2026-02-23 07:39:04.71803438 +0000 UTC m=+0.244654236 container start 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux , ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 02:39:04 localhost bash[32556]: 6682f631389f5fea34334a28d29db8aea85f2971ab07e24b48970944e80cac0e
Feb 23 02:39:04 localhost systemd[1]: Started Ceph osd.5 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 02:39:04 localhost ceph-osd[32575]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 02:39:04 localhost ceph-osd[32575]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 23 02:39:04 localhost ceph-osd[32575]: pidfile_write: ignore empty --pid-file
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:04 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:04 localhost ceph-osd[32575]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 23 02:39:04 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 02:39:05 localhost ceph-osd[32575]: starting osd.5 osd_data /var/lib/ceph/osd/ceph-5 /var/lib/ceph/osd/ceph-5/journal
Feb 23 02:39:05 localhost ceph-osd[32575]: load: jerasure load: lrc
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 02:39:05 localhost podman[32661]:
Feb 23 02:39:05 localhost podman[32661]: 2026-02-23 07:39:05.57416127 +0000 UTC m=+0.080703824 container create d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2026-02-09T10:25:24Z)
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) close
Feb 23 02:39:05 localhost systemd[1]: Started libpod-conmon-d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8.scope.
Feb 23 02:39:05 localhost podman[32661]: 2026-02-23 07:39:05.53832073 +0000 UTC m=+0.044863274 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 02:39:05 localhost systemd[1]: Started libcrun container.
Feb 23 02:39:05 localhost podman[32661]: 2026-02-23 07:39:05.667406263 +0000 UTC m=+0.173948877 container init d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph)
Feb 23 02:39:05 localhost podman[32661]: 2026-02-23 07:39:05.676406495 +0000 UTC m=+0.182949039 container start d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 23 02:39:05 localhost podman[32661]: 2026-02-23 07:39:05.676612945 +0000 UTC m=+0.183155489 container attach d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True)
Feb 23 02:39:05 localhost lucid_swartz[32679]: 167 167
Feb 23 02:39:05 localhost systemd[1]: libpod-d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8.scope: Deactivated successfully.
Feb 23 02:39:05 localhost podman[32661]: 2026-02-23 07:39:05.682328005 +0000 UTC m=+0.188870579 container died d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, io.buildah.version=1.42.2, GIT_BRANCH=main)
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 30.720 iops: 7864.273 elapsed_sec: 0.381
Feb 23 02:39:05 localhost ceph-osd[31633]: log_channel(cluster) log [WRN] : OSD bench result of 7864.273296 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 0 waiting for initial osdmap
Feb 23 02:39:05 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2[31629]: 2026-02-23T07:39:05.686+0000 7feb3b9a5640 -1 osd.2 0 waiting for initial osdmap
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 check_osdmap_features require_osd_release unknown -> reef
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 set_numa_affinity not setting numa affinity
Feb 23 02:39:05 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-2[31629]: 2026-02-23T07:39:05.705+0000 7feb367ba640 -1 osd.2 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 23 02:39:05 localhost ceph-osd[31633]: osd.2 12 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial
Feb 23 02:39:05 localhost podman[32684]: 2026-02-23 07:39:05.790012784 +0000 UTC m=+0.091857030 container remove d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_swartz, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, release=1770267347, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2)
Feb 23 02:39:05 localhost systemd[1]: libpod-conmon-d1408fe2c24d5ba44d9a845c6d3a4b4c1720109216931c1aeab55fcbf64375a8.scope: Deactivated successfully.
Feb 23 02:39:05 localhost ceph-osd[32575]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 23 02:39:05 localhost ceph-osd[32575]: osd.5:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612ce00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 23 02:39:05 localhost ceph-osd[32575]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 23 02:39:05 localhost ceph-osd[32575]: bluefs mount
Feb 23 02:39:05 localhost ceph-osd[32575]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 23 02:39:05 localhost ceph-osd[32575]: bluefs mount shared_bdev_used = 0
Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: RocksDB version: 7.9.2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Git sha 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: DB SUMMARY
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: DB Session ID: VGZ6RC4M9VF4RKBK61PE
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: CURRENT file: CURRENT
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: IDENTITY file: IDENTITY
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.error_if_exists: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.create_if_missing: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_checks: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.env: 0x564b56f51c70
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.fs: LegacyFileSystem
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.info_log: 0x564b570d07e0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_file_opening_threads: 16
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.statistics: (nil)
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.use_fsync: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_log_file_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.log_file_time_to_roll: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.keep_log_file_num: 1000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.recycle_log_file_num: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.allow_fallocate: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.allow_mmap_reads: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.allow_mmap_writes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.use_direct_reads: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.create_missing_column_families: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.db_log_dir:
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.wal_dir: db.wal
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_cache_numshardbits: 6
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.WAL_ttl_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.WAL_size_limit_MB: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.manifest_preallocation_size: 4194304
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.is_fd_close_on_exec: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.advise_random_on_open: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.db_write_buffer_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_manager: 0x564b56116140
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.access_hint_on_compaction_start: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.random_access_max_buffer_size: 1048576
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.use_adaptive_mutex: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.rate_limiter: (nil)
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.wal_recovery_mode: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_thread_tracking: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_pipelined_write: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.unordered_write: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.allow_concurrent_memtable_write: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_thread_max_yield_usec: 100
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_thread_slow_yield_usec: 3
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.row_cache: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.wal_filter: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.avoid_flush_during_recovery: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.allow_ingest_behind: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.two_write_queues: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.manual_wal_flush: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.wal_compression: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.atomic_flush: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.persist_stats_to_disk: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_dbid_to_manifest: 0
Feb 23 02:39:05 localhost ceph-osd[32575]:
rocksdb: Options.log_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.best_efforts_recovery: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.allow_data_in_errors: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.db_host_id: __hostname__ Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enforce_single_del_contracts: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_background_jobs: 4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_background_compactions: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_subcompactions: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.delayed_write_rate : 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_total_wal_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.stats_dump_period_sec: 600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.stats_persist_period_sec: 600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_open_files: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bytes_per_sync: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.strict_bytes_per_sync: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_background_flushes: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Compression algorithms supported: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kZSTD supported: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kXpressCompression supported: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kBZip2Compression supported: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kLZ4Compression supported: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kZlibCompression supported: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: #011kSnappyCompression supported: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: DMutex implementation: pthread_mutex_t Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 
02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 
02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 
02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:05 localhost 
ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:05 localhost ceph-osd[32575]: 
rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.enable_blob_files: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.disable_auto_compactions: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:05 localhost 
ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:05 
localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost 
ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost 
ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:05 
localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 
23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d09a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56104850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d0bc0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d0bc0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b570d0bc0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: 
rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
Options.blob_compression_type: NoCompression Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: 
[db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fbdd6168-4234-4c0e-b38b-7b6ed96d1042 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832345888137, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832345888404, "job": 1, "event": "recovery_finished"} Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 23 02:39:05 localhost ceph-osd[32575]: 
bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old nid_max 1025 Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old blobid_max 10240 Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta min_alloc_size 0x1000 Feb 23 02:39:05 localhost ceph-osd[32575]: freelist init Feb 23 02:39:05 localhost ceph-osd[32575]: freelist _read_cfg Feb 23 02:39:05 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Feb 23 02:39:05 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Feb 23 02:39:05 localhost ceph-osd[32575]: bluefs umount Feb 23 02:39:05 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) close Feb 23 02:39:06 localhost podman[32901]: Feb 23 02:39:06 localhost podman[32901]: 2026-02-23 07:39:06.018779635 +0000 UTC m=+0.081642084 container create 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, distribution-scope=public, RELEASE=main, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git) Feb 23 02:39:06 localhost systemd[1]: Started libpod-conmon-84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564.scope. Feb 23 02:39:06 localhost systemd[1]: Started libcrun container. Feb 23 02:39:06 localhost podman[32901]: 2026-02-23 07:39:05.989455126 +0000 UTC m=+0.052317565 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:39:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:06 localhost podman[32901]: 2026-02-23 07:39:06.134170878 +0000 UTC m=+0.197033317 container init 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, 
RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64) Feb 23 02:39:06 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block Feb 23 02:39:06 localhost systemd[1]: var-lib-containers-storage-overlay-3cac4e07189b1dadf83d6b52eea540775735b7fd67deefac60dc012da60a41b5-merged.mount: Deactivated successfully. 
Feb 23 02:39:06 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument Feb 23 02:39:06 localhost ceph-osd[32575]: bdev(0x564b5612d180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Feb 23 02:39:06 localhost ceph-osd[32575]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB Feb 23 02:39:06 localhost ceph-osd[32575]: bluefs mount Feb 23 02:39:06 localhost ceph-osd[32575]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Feb 23 02:39:06 localhost ceph-osd[32575]: bluefs mount shared_bdev_used = 4718592 Feb 23 02:39:06 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Feb 23 02:39:06 localhost podman[32901]: 2026-02-23 07:39:06.1506109 +0000 UTC m=+0.213473319 container start 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z) Feb 23 02:39:06 localhost podman[32901]: 2026-02-23 07:39:06.150842812 +0000 UTC m=+0.213705231 container attach 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, release=1770267347, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc.) 
Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: RocksDB version: 7.9.2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Git sha 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Compile date 2026-02-06 00:00:00 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: DB SUMMARY Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: DB Session ID: VGZ6RC4M9VF4RKBK61PF Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: CURRENT file: CURRENT Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: IDENTITY file: IDENTITY Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.error_if_exists: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.create_if_missing: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.env: 0x564b562522a0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.fs: LegacyFileSystem Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.info_log: 0x564b571440c0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_file_opening_threads: 16 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.statistics: (nil) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.use_fsync: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.max_log_file_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.log_file_time_to_roll: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.keep_log_file_num: 1000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.recycle_log_file_num: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.allow_fallocate: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.allow_mmap_reads: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.allow_mmap_writes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.use_direct_reads: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.create_missing_column_families: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.db_log_dir: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.wal_dir: db.wal Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_cache_numshardbits: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.advise_random_on_open: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.db_write_buffer_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_manager: 0x564b56116140 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.random_access_max_buffer_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.use_adaptive_mutex: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.rate_limiter: (nil) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.wal_recovery_mode: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_thread_tracking: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_pipelined_write: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.unordered_write: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.row_cache: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.wal_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.allow_ingest_behind: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.two_write_queues: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.manual_wal_flush: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.wal_compression: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.atomic_flush: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.persist_stats_to_disk: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.log_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.best_efforts_recovery: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.allow_data_in_errors: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.db_host_id: __hostname__ Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enforce_single_del_contracts: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_background_jobs: 4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_background_compactions: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_subcompactions: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.writable_file_max_buffer_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.delayed_write_rate : 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_total_wal_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.stats_dump_period_sec: 600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.stats_persist_period_sec: 600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_open_files: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bytes_per_sync: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 
23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_readahead_size: 2097152 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_background_flushes: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Compression algorithms supported: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kZSTD supported: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kXpressCompression supported: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kBZip2Compression supported: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kLZ4Compression supported: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kZlibCompression supported: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: #011kSnappyCompression supported: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: DMutex implementation: pthread_mutex_t Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
[db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 
max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 
0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: 
rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 
localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 
02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 
02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 
02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 
use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false 
Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 
localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b57144280)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b561042d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b571444c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56105610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 
localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 
23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b571444c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56105610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [O-2]: Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.merge_operator: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_filter_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.sst_partitioner_factory: None Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564b571444c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564b56105610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.write_buffer_size: 16777216 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number: 64 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression: LZ4 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression: Disabled Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.num_levels: 7 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.level: 
32767 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.enabled: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 02:39:06 localhost 
ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.arena_block_size: 1048576 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 
02:39:06 localhost ceph-osd[32575]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_support: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.bloom_locality: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.max_successive_merges: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.force_consistency_checks: 1 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.ttl: 2592000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_files: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.min_blob_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_size: 268435456 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Feb 23 02:39:06 
localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: fbdd6168-4234-4c0e-b38b-7b6ed96d1042 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346200632, "job": 1, "event": "recovery_started", "wal_files": [31]} Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346207914, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832346, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fbdd6168-4234-4c0e-b38b-7b6ed96d1042", "db_session_id": "VGZ6RC4M9VF4RKBK61PF", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346213050, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832346, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, 
"fast_compression_estimated_data_size": 0, "db_id": "fbdd6168-4234-4c0e-b38b-7b6ed96d1042", "db_session_id": "VGZ6RC4M9VF4RKBK61PF", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771832346217702, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771832346, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "fbdd6168-4234-4c0e-b38b-7b6ed96d1042", "db_session_id": "VGZ6RC4M9VF4RKBK61PF", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: EVENT_LOG_v1 
{"time_micros": 1771832346222637, "job": 1, "event": "recovery_finished"} Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x564b56242700 Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: DB pointer 0x564b57025a00 Feb 23 02:39:06 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Feb 23 02:39:06 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super from 4, latest 4 Feb 23 02:39:06 localhost ceph-osd[32575]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super done Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 02:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 
18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for 
pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 
seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 460.80 MB usage: 0 Feb 23 02:39:06 localhost ceph-osd[32575]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Feb 23 02:39:06 localhost ceph-osd[32575]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Feb 23 02:39:06 localhost ceph-osd[32575]: _get_class not permitted to load lua Feb 23 02:39:06 localhost ceph-osd[32575]: _get_class not permitted to load sdk Feb 23 02:39:06 localhost ceph-osd[32575]: _get_class not permitted to load test_remote_reads Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for clients Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for osds Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 load_pgs Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 load_pgs opened 0 pgs Feb 23 02:39:06 localhost ceph-osd[32575]: osd.5 0 log_to_monitors true Feb 23 02:39:06 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5[32571]: 2026-02-23T07:39:06.268+0000 7f8bc82f7a80 -1 osd.5 0 log_to_monitors true Feb 23 02:39:06 localhost ceph-osd[31633]: osd.2 13 state: booting -> active Feb 23 02:39:06 localhost frosty_lewin[32916]: { Feb 23 02:39:06 localhost 
frosty_lewin[32916]: "3c38c3a7-5c4b-4b97-99e3-119e348f6df6": { Feb 23 02:39:06 localhost frosty_lewin[32916]: "ceph_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46", Feb 23 02:39:06 localhost frosty_lewin[32916]: "device": "/dev/mapper/ceph_vg0-ceph_lv0", Feb 23 02:39:06 localhost frosty_lewin[32916]: "osd_id": 2, Feb 23 02:39:06 localhost frosty_lewin[32916]: "osd_uuid": "3c38c3a7-5c4b-4b97-99e3-119e348f6df6", Feb 23 02:39:06 localhost frosty_lewin[32916]: "type": "bluestore" Feb 23 02:39:06 localhost frosty_lewin[32916]: }, Feb 23 02:39:06 localhost frosty_lewin[32916]: "79650a5e-2685-4848-a7c4-7cead1e09ea1": { Feb 23 02:39:06 localhost frosty_lewin[32916]: "ceph_fsid": "f1fea371-cb69-578d-a3d0-b5c472a84b46", Feb 23 02:39:06 localhost frosty_lewin[32916]: "device": "/dev/mapper/ceph_vg1-ceph_lv1", Feb 23 02:39:06 localhost frosty_lewin[32916]: "osd_id": 5, Feb 23 02:39:06 localhost frosty_lewin[32916]: "osd_uuid": "79650a5e-2685-4848-a7c4-7cead1e09ea1", Feb 23 02:39:06 localhost frosty_lewin[32916]: "type": "bluestore" Feb 23 02:39:06 localhost frosty_lewin[32916]: } Feb 23 02:39:06 localhost frosty_lewin[32916]: } Feb 23 02:39:06 localhost systemd[1]: libpod-84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564.scope: Deactivated successfully. 
Feb 23 02:39:06 localhost podman[32901]: 2026-02-23 07:39:06.788509204 +0000 UTC m=+0.851371683 container died 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph) Feb 23 02:39:06 localhost systemd[1]: var-lib-containers-storage-overlay-d19888b42ed47ae85e70be871f8597528c5ba300873456929a49fbcdee1aad66-merged.mount: Deactivated successfully. 
Feb 23 02:39:06 localhost podman[33168]: 2026-02-23 07:39:06.890688373 +0000 UTC m=+0.092616389 container remove 84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lewin, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64) Feb 23 02:39:06 localhost systemd[1]: libpod-conmon-84814a137f6f0e56f29f34478c6746623b245c6e6498e993e1ad5f21a1158564.scope: Deactivated successfully. 
Feb 23 02:39:07 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Feb 23 02:39:07 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 done with init, starting boot process Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 start_boot Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_max_backfills set to 1 Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Feb 23 02:39:07 localhost ceph-osd[32575]: osd.5 0 bench count 12288000 bsize 4 KiB Feb 23 02:39:08 localhost podman[33297]: 2026-02-23 07:39:08.507561801 +0000 UTC m=+0.110880327 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, 
Inc., architecture=x86_64, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main) Feb 23 02:39:08 localhost podman[33297]: 2026-02-23 07:39:08.630634108 +0000 UTC m=+0.233952614 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, release=1770267347, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public) Feb 23 02:39:08 localhost ceph-osd[31633]: osd.2 15 crush map has features 288514051259236352, adjusting msgr requires for clients Feb 23 02:39:08 localhost ceph-osd[31633]: osd.2 15 crush map has features 
288514051259236352 was 288514050185503233, adjusting msgr requires for mons Feb 23 02:39:08 localhost ceph-osd[31633]: osd.2 15 crush map has features 3314933000852226048, adjusting msgr requires for osds Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 28.360 iops: 7260.109 elapsed_sec: 0.413 Feb 23 02:39:10 localhost ceph-osd[32575]: log_channel(cluster) log [WRN] : OSD bench result of 7260.108975 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.5. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 0 waiting for initial osdmap Feb 23 02:39:10 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5[32571]: 2026-02-23T07:39:10.372+0000 7f8bc4a8b640 -1 osd.5 0 waiting for initial osdmap Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 crush map has features 288514051259236352, adjusting msgr requires for clients Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 crush map has features 3314933000852226048, adjusting msgr requires for osds Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 check_osdmap_features require_osd_release unknown -> reef Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Feb 23 02:39:10 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-osd-5[32571]: 2026-02-23T07:39:10.394+0000 7f8bbf8a0640 -1 osd.5 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 set_numa_affinity 
not setting numa affinity Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 16 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial Feb 23 02:39:10 localhost podman[33495]: Feb 23 02:39:10 localhost podman[33495]: 2026-02-23 07:39:10.487401961 +0000 UTC m=+0.060927168 container create fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2) Feb 23 02:39:10 localhost systemd[1]: Started libpod-conmon-fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5.scope. Feb 23 02:39:10 localhost systemd[1]: Started libcrun container. 
Feb 23 02:39:10 localhost podman[33495]: 2026-02-23 07:39:10.458040921 +0000 UTC m=+0.031566178 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:39:10 localhost podman[33495]: 2026-02-23 07:39:10.559840661 +0000 UTC m=+0.133365868 container init fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, name=rhceph, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 02:39:10 localhost podman[33495]: 2026-02-23 07:39:10.570899441 +0000 UTC m=+0.144424648 container start fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main) Feb 23 02:39:10 localhost podman[33495]: 2026-02-23 07:39:10.571392977 +0000 UTC m=+0.144918214 container attach fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, RELEASE=main, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , ceph=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.42.2) Feb 23 02:39:10 localhost busy_dewdney[33510]: 167 167 Feb 23 02:39:10 localhost systemd[1]: libpod-fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5.scope: Deactivated successfully. Feb 23 02:39:10 localhost podman[33495]: 2026-02-23 07:39:10.575042448 +0000 UTC m=+0.148567705 container died fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 02:39:10 localhost systemd[1]: var-lib-containers-storage-overlay-f3a00d67f949491969554252455383321180dec115fef8603cccfb97e1e81e82-merged.mount: Deactivated successfully. Feb 23 02:39:10 localhost podman[33515]: 2026-02-23 07:39:10.663686288 +0000 UTC m=+0.079257648 container remove fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_dewdney, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7) Feb 23 02:39:10 localhost systemd[1]: 
libpod-conmon-fd486a448af1d0c901efcb6325cf0fa9c13b6eb53cf3e0f94d36f53135d2eeb5.scope: Deactivated successfully. Feb 23 02:39:10 localhost ceph-osd[32575]: osd.5 17 state: booting -> active Feb 23 02:39:10 localhost podman[33534]: Feb 23 02:39:10 localhost podman[33534]: 2026-02-23 07:39:10.870712849 +0000 UTC m=+0.071303482 container create c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, RELEASE=main, release=1770267347, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 02:39:10 localhost systemd[1]: Started libpod-conmon-c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943.scope. Feb 23 02:39:10 localhost systemd[1]: Started libcrun container. 
Feb 23 02:39:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:10 localhost podman[33534]: 2026-02-23 07:39:10.842078397 +0000 UTC m=+0.042669470 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 02:39:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 02:39:10 localhost podman[33534]: 2026-02-23 07:39:10.970684853 +0000 UTC m=+0.171275486 container init c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, 
io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 02:39:10 localhost podman[33534]: 2026-02-23 07:39:10.981480839 +0000 UTC m=+0.182071482 container start c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, RELEASE=main, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 02:39:10 localhost podman[33534]: 2026-02-23 07:39:10.981765815 +0000 UTC m=+0.182356448 container attach c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, ceph=True, RELEASE=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container) Feb 23 02:39:11 localhost elated_proskuriakova[33549]: [ Feb 23 02:39:11 localhost elated_proskuriakova[33549]: { Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "available": false, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "ceph_device": false, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "lsm_data": {}, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "lvs": [], Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "path": "/dev/sr0", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "rejected_reasons": [ Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "Insufficient space (<5GB)", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "Has a FileSystem" Feb 23 02:39:11 localhost elated_proskuriakova[33549]: ], Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "sys_api": { Feb 23 
02:39:11 localhost elated_proskuriakova[33549]: "actuators": null, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "device_nodes": "sr0", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "human_readable_size": "482.00 KB", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "id_bus": "ata", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "model": "QEMU DVD-ROM", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "nr_requests": "2", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "partitions": {}, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "path": "/dev/sr0", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "removable": "1", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "rev": "2.5+", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "ro": "0", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "rotational": "1", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "sas_address": "", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "sas_device_handle": "", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "scheduler_mode": "mq-deadline", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "sectors": 0, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "sectorsize": "2048", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "size": 493568.0, Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "support_discard": "0", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "type": "disk", Feb 23 02:39:11 localhost elated_proskuriakova[33549]: "vendor": "QEMU" Feb 23 02:39:11 localhost elated_proskuriakova[33549]: } Feb 23 02:39:11 localhost elated_proskuriakova[33549]: } Feb 23 02:39:11 localhost elated_proskuriakova[33549]: ] Feb 23 02:39:11 localhost systemd[1]: libpod-c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943.scope: Deactivated successfully. 
Feb 23 02:39:11 localhost podman[33534]: 2026-02-23 07:39:11.80976196 +0000 UTC m=+1.010352603 container died c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux , release=1770267347, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 02:39:11 localhost systemd[1]: tmp-crun.jTKsZx.mount: Deactivated successfully. Feb 23 02:39:11 localhost systemd[1]: var-lib-containers-storage-overlay-2ff7c36fe8ea2aa868693fd5fa630c6d71881a5a14da6fa3f1ad5e023647a557-merged.mount: Deactivated successfully. 
Feb 23 02:39:11 localhost podman[34766]: 2026-02-23 07:39:11.897433288 +0000 UTC m=+0.077465594 container remove c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elated_proskuriakova, RELEASE=main, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, architecture=x86_64) Feb 23 02:39:11 localhost systemd[1]: libpod-conmon-c30dd87d971428f3344c194f9916593dd0504efce4b5d45abd13ae49956d0943.scope: Deactivated successfully. 
Feb 23 02:39:12 localhost ceph-osd[32575]: osd.5 pg_epoch: 17 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=17) [3,5,4] r=1 lpr=17 pi=[15,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:39:15 localhost sshd[34793]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:39:20 localhost podman[34895]: 2026-02-23 07:39:20.987046892 +0000 UTC m=+0.080644251 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_CLEAN=True, vcs-type=git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux ) Feb 23 02:39:21 localhost podman[34895]: 2026-02-23 07:39:21.093222486 +0000 UTC m=+0.186819915 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1770267347) Feb 23 02:39:58 localhost sshd[34978]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:40:22 localhost systemd[1]: tmp-crun.16wQsG.mount: Deactivated successfully. 
Feb 23 02:40:22 localhost podman[35081]: 2026-02-23 07:40:22.936717138 +0000 UTC m=+0.092625787 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_CLEAN=True, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) 
Feb 23 02:40:23 localhost podman[35081]: 2026-02-23 07:40:23.050702434 +0000 UTC m=+0.206611083 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 02:40:27 localhost systemd[1]: session-13.scope: Deactivated successfully. Feb 23 02:40:27 localhost systemd[1]: session-13.scope: Consumed 20.999s CPU time. Feb 23 02:40:27 localhost systemd-logind[759]: Session 13 logged out. Waiting for processes to exit. Feb 23 02:40:27 localhost systemd-logind[759]: Removed session 13. 
Feb 23 02:40:41 localhost sshd[35222]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:41:24 localhost sshd[35224]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:42:08 localhost sshd[35304]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:42:47 localhost systemd[26179]: Created slice User Background Tasks Slice. Feb 23 02:42:47 localhost systemd[26179]: Starting Cleanup of User's Temporary Files and Directories... Feb 23 02:42:47 localhost systemd[26179]: Finished Cleanup of User's Temporary Files and Directories. Feb 23 02:42:54 localhost sshd[35383]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:42:57 localhost sshd[35385]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:42:58 localhost sshd[35386]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:43:37 localhost sshd[35464]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:43:55 localhost sshd[35466]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:43:55 localhost systemd-logind[759]: New session 27 of user zuul. Feb 23 02:43:56 localhost systemd[1]: Started Session 27 of User zuul. 
Feb 23 02:43:56 localhost python3[35514]: ansible-ansible.legacy.ping Invoked with data=pong Feb 23 02:43:57 localhost python3[35559]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 02:43:57 localhost python3[35579]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Feb 23 02:43:58 localhost python3[35636]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:43:58 localhost python3[35679]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771832638.0528462-66849-101500868619506/source _original_basename=tmp7f398_j3 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:43:59 localhost python3[35709]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:43:59 localhost python3[35725]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:43:59 localhost python3[35741]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:44:00 localhost python3[35757]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCaFiAv9bTisS17GN1FZ7h/VJaLu2YTZdiVc9K45WqX/JhZ8Pwx1BXqBJEGlK+qmwEqEak4GGxIE3mEhiQwiY15E3nfJntSoOu8LhTWltcLr5Uy2mx8nXUZwB3QoNiY4y5uXCriTu6HLCXbdwhnWWT8PgX7V31GosHI0JQEpo4ixoShAJC2sFXL7wd3dTaZBo73qVrUhmekv/2GJo179k6wGblVjOwgB9nkfcuo0acLUaot1Uhc7ZrZM3Nfa7bjrW7OigrLvtNra7bBsjfeTgu6vOxxy1DcTD1xBKab631zTIIugiPViMTGrxgsBpyOc4tJpl1grZCJxm8mBDN25oK2jvP/NwUcC3C9ASWEr9U4QiAOTYN03OAyLGbhq48W3SYHwIi8/awDfJapvA+5rlCq/Xb+Fi/KrAaPxVoEehqWLBzCv0u/ZLZarmOih5rcgNTLMQ3l1/nVNtx+VP3eLtqA5a71JqntFfS80adH/3Px+Wen0lIixRqguGNzrD8ZGcU= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Feb 23 02:44:01 localhost python3[35771]: ansible-ping Invoked with data=pong Feb 23 02:44:03 localhost sshd[35772]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:44:12 localhost sshd[35774]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:44:12 localhost systemd-logind[759]: New session 28 of user tripleo-admin. Feb 23 02:44:12 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 23 02:44:12 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 23 02:44:12 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 23 02:44:12 localhost systemd[1]: Starting User Manager for UID 1003... Feb 23 02:44:12 localhost systemd[35778]: Queued start job for default target Main User Target. Feb 23 02:44:12 localhost systemd[35778]: Created slice User Application Slice. Feb 23 02:44:12 localhost systemd[35778]: Started Mark boot as successful after the user session has run 2 minutes. Feb 23 02:44:12 localhost systemd[35778]: Started Daily Cleanup of User's Temporary Directories. Feb 23 02:44:12 localhost systemd[35778]: Reached target Paths. Feb 23 02:44:12 localhost systemd[35778]: Reached target Timers. Feb 23 02:44:12 localhost systemd[35778]: Starting D-Bus User Message Bus Socket... Feb 23 02:44:12 localhost systemd[35778]: Starting Create User's Volatile Files and Directories... Feb 23 02:44:12 localhost systemd[35778]: Finished Create User's Volatile Files and Directories. Feb 23 02:44:12 localhost systemd[35778]: Listening on D-Bus User Message Bus Socket. Feb 23 02:44:12 localhost systemd[35778]: Reached target Sockets. Feb 23 02:44:12 localhost systemd[35778]: Reached target Basic System. Feb 23 02:44:12 localhost systemd[35778]: Reached target Main User Target. Feb 23 02:44:12 localhost systemd[35778]: Startup finished in 119ms. Feb 23 02:44:12 localhost systemd[1]: Started User Manager for UID 1003. 
Feb 23 02:44:12 localhost systemd[1]: Started Session 28 of User tripleo-admin. Feb 23 02:44:13 localhost python3[35839]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Feb 23 02:44:18 localhost python3[35859]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Feb 23 02:44:19 localhost python3[35875]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None Feb 23 02:44:19 localhost python3[35923]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.2_n_ngzjtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:44:19 localhost sshd[35938]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:44:20 localhost python3[35955]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.2_n_ngzjtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! 
marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:44:21 localhost python3[35971]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.2_n_ngzjtmphosts insertbefore=BOF block=172.17.0.106 np0005626463.localdomain np0005626463#012172.18.0.106 np0005626463.storage.localdomain np0005626463.storage#012172.20.0.106 np0005626463.storagemgmt.localdomain np0005626463.storagemgmt#012172.17.0.106 np0005626463.internalapi.localdomain np0005626463.internalapi#012172.19.0.106 np0005626463.tenant.localdomain np0005626463.tenant#012192.168.122.106 np0005626463.ctlplane.localdomain np0005626463.ctlplane#012172.17.0.107 np0005626465.localdomain np0005626465#012172.18.0.107 np0005626465.storage.localdomain np0005626465.storage#012172.20.0.107 np0005626465.storagemgmt.localdomain np0005626465.storagemgmt#012172.17.0.107 np0005626465.internalapi.localdomain np0005626465.internalapi#012172.19.0.107 np0005626465.tenant.localdomain np0005626465.tenant#012192.168.122.107 np0005626465.ctlplane.localdomain np0005626465.ctlplane#012172.17.0.108 np0005626466.localdomain np0005626466#012172.18.0.108 np0005626466.storage.localdomain np0005626466.storage#012172.20.0.108 np0005626466.storagemgmt.localdomain np0005626466.storagemgmt#012172.17.0.108 np0005626466.internalapi.localdomain np0005626466.internalapi#012172.19.0.108 np0005626466.tenant.localdomain np0005626466.tenant#012192.168.122.108 np0005626466.ctlplane.localdomain np0005626466.ctlplane#012172.17.0.103 np0005626459.localdomain np0005626459#012172.18.0.103 np0005626459.storage.localdomain np0005626459.storage#012172.20.0.103 np0005626459.storagemgmt.localdomain np0005626459.storagemgmt#012172.17.0.103 np0005626459.internalapi.localdomain np0005626459.internalapi#012172.19.0.103 np0005626459.tenant.localdomain np0005626459.tenant#012192.168.122.103 
np0005626459.ctlplane.localdomain np0005626459.ctlplane#012172.17.0.104 np0005626460.localdomain np0005626460#012172.18.0.104 np0005626460.storage.localdomain np0005626460.storage#012172.20.0.104 np0005626460.storagemgmt.localdomain np0005626460.storagemgmt#012172.17.0.104 np0005626460.internalapi.localdomain np0005626460.internalapi#012172.19.0.104 np0005626460.tenant.localdomain np0005626460.tenant#012192.168.122.104 np0005626460.ctlplane.localdomain np0005626460.ctlplane#012172.17.0.105 np0005626461.localdomain np0005626461#012172.18.0.105 np0005626461.storage.localdomain np0005626461.storage#012172.20.0.105 np0005626461.storagemgmt.localdomain np0005626461.storagemgmt#012172.17.0.105 np0005626461.internalapi.localdomain np0005626461.internalapi#012172.19.0.105 np0005626461.tenant.localdomain np0005626461.tenant#012192.168.122.105 np0005626461.ctlplane.localdomain np0005626461.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.134 overcloud.storage.localdomain#012172.20.0.172 overcloud.storagemgmt.localdomain#012172.17.0.129 overcloud.internalapi.localdomain#012172.21.0.176 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:44:21 localhost python3[35987]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.2_n_ngzjtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:44:22 localhost python3[36004]: ansible-file Invoked with path=/tmp/ansible.2_n_ngzjtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:44:23 localhost python3[36020]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:44:23 localhost python3[36037]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 02:44:28 localhost python3[36056]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:44:29 localhost python3[36073]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] 
download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 02:44:57 localhost sshd[37119]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:45:02 localhost sshd[37143]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:45:02 localhost sshd[37149]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:45:35 localhost kernel: SELinux: Converting 2700 SID table entries... Feb 23 02:45:35 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 23 02:45:35 localhost kernel: SELinux: policy capability open_perms=1 Feb 23 02:45:35 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 23 02:45:35 localhost kernel: SELinux: policy capability always_check_network=0 Feb 23 02:45:35 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 23 02:45:35 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 23 02:45:35 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 23 02:45:35 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=5 res=1 Feb 23 02:45:36 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=6 res=1 Feb 23 02:45:36 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 02:45:36 localhost systemd[1]: Starting man-db-cache-update.service... Feb 23 02:45:36 localhost systemd[1]: Reloading. Feb 23 02:45:36 localhost systemd-rc-local-generator[37461]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:45:36 localhost systemd-sysv-generator[37465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:45:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:45:36 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 23 02:45:36 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 23 02:45:36 localhost systemd[1]: Finished man-db-cache-update.service. Feb 23 02:45:36 localhost systemd[1]: run-rbec30ba95a1043c08d2e4681aac6c730.service: Deactivated successfully. Feb 23 02:45:39 localhost python3[37934]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:40 localhost python3[38073]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:45:40 localhost systemd[1]: Reloading. Feb 23 02:45:40 localhost systemd-sysv-generator[38101]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:45:40 localhost systemd-rc-local-generator[38098]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:45:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 02:45:42 localhost python3[38127]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:45:42 localhost python3[38143]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:43 localhost python3[38160]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 23 02:45:44 localhost python3[38178]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:45:44 localhost python3[38196]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:45:45 localhost python3[38214]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 02:45:45 localhost systemd[1]: Reloading Network Manager... 
Feb 23 02:45:45 localhost NetworkManager[5974]: [1771832745.1935] audit: op="reload" arg="0" pid=38217 uid=0 result="success" Feb 23 02:45:45 localhost NetworkManager[5974]: [1771832745.1943] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf)) Feb 23 02:45:45 localhost NetworkManager[5974]: [1771832745.1944] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged Feb 23 02:45:45 localhost systemd[1]: Reloaded Network Manager. Feb 23 02:45:45 localhost sshd[38228]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:45:45 localhost python3[38235]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:46 localhost python3[38252]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:45:46 localhost python3[38270]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:45:46 localhost python3[38286]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:45:47 localhost python3[38302]: ansible-tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Feb 23 02:45:48 localhost python3[38318]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:45:49 localhost python3[38334]: ansible-blockinfile Invoked with path=/tmp/ansible.vetg1p3y block=[192.168.122.106]*,[np0005626463.ctlplane.localdomain]*,[172.17.0.106]*,[np0005626463.internalapi.localdomain]*,[172.18.0.106]*,[np0005626463.storage.localdomain]*,[172.20.0.106]*,[np0005626463.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005626463.tenant.localdomain]*,[np0005626463.localdomain]*,[np0005626463]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/Caj4zYKd24ctvaRU1Hf9nT058OF4bRnDJ3bHimmkyIL7cccXAxo3lx50wZHWRYBhF5Wes6TmqnUTTK1h5wVdI8f7YtQ9IyMIlfoEiTThF5PgODVuRYq+YGjFIy7MTPyBnB2428aT4dlYqHSuxK2gL6ALlCJHNyeh3RW3jCOG89veDoRmbqHGoaD+xPRnfsdHLoLFNfxT4UJiKRuqsEd5fNtc392ROSa5XM3PPIs3YTypYmpfFHs1B1j+y6oZV8Ha/QXqURpI7/aJmfnDzXLMsLWp4GRpkwzljvNp87S5HL+kJMo79n0Vmh2JdN1orNP/4A2t/TENckHbrZCm+YmPqUqvpHkAZfFfmvP62YZTPq/qOjBMMq6ulGSHd2I4XfE7NNZRKoS3G4HVlBb0ONS13PaWx9rrJCRlF64L1dHSt9zpKrvRbWkSdXA0PwwehrU5/OBo1IY4WsRlWmPeET1/dFWiIr1t9uGjp5vmACAx7rnC6G5qSEhQ3/k1Wa57k/k=#012[192.168.122.107]*,[np0005626465.ctlplane.localdomain]*,[172.17.0.107]*,[np0005626465.internalapi.localdomain]*,[172.18.0.107]*,[np0005626465.storage.localdomain]*,[172.20.0.107]*,[np0005626465.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005626465.tenant.localdomain]*,[np0005626465.localdomain]*,[np0005626465]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCUc8l2oYgfdO7xb3vN27co3Q/sFNU6Rw5wThiW1JMfeIzI90ZzS/L+BpsDsX8q2CW9QOHXrbUormpGsiNnix5j1P29Jc6e9A2mDlipXBrFSUiVZa8UOL03lFSz4nElapkASin2GCdHqy7//gGdQMKRP62VXpdhofb7i/N/gGoV5hSc8Q36KFDbWpvPkhD5H8nZtAfyxM99KwlC62D8jSN+gdoRtMRFPQTtyvyskyrgnXGC6xV71WTa6LJ6Meo7tfj4JlvDAWwlD+f9Ruu2ty2aHd2feVVKYvxZ4Z45iSfJnNxRFJvu1QOY0IU4Fj942leKwr6f0B5ogPFlTI7wRrAB1d9tri1WW2aL1AqYhdZscWi0VArYxLQr7BCVqz8KgFIzjbPoJ7uYnWcuDSiWlC1NJVO7Ij2natf8wZyvSyH+vydamkyoaNwxMnm4qs0/rvjwL49MdrHB79rXjHYJpt/JCBvn9a/rh5KqVH40P00DP35H71zyHPCSu1L20S/wY1k=#012[192.168.122.108]*,[np0005626466.ctlplane.localdomain]*,[172.17.0.108]*,[np0005626466.internalapi.localdomain]*,[172.18.0.108]*,[np0005626466.storage.localdomain]*,[172.20.0.108]*,[np0005626466.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005626466.tenant.localdomain]*,[np0005626466.localdomain]*,[np0005626466]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD4dg5LfbOyIHJudQjfDyIcqYXRqMUeYQIpjQPmNS0Tl7/EpBaYixjqlNovKIWOwkS4E2n4hwPLSTGSihYb5BeUDw32T80RumycS2tjBCSLiuq93xpTOaL2X+7wykkOSfY5xya13qrTg0ROJip0B6PSSF+Rn28SAKLh91euCdRaxWTAMeOSTP9WeCA3d0gsgb4xSMMWZxR4o1BU2bixjAcJHAlKYDc1OGpKkirRoziu9Y4nq2lmbwTg5HiS8STVkqyGHba9k6IC0eF2ZmT6M2thoHatYVtjuUeEE9bSvaAFB8oSI9Np6+OaluvuoKJYjRA3dzEQOi4ft/wwUrJfvyypDAxKBkxo7lCWIDEBK5Zb9BVoo68psz2IVPNGNZJtKXiq58CAqZTR02l/wEq4wB1/hp7ZW+ZMnHQUq1FpGITIA89KZeL9xNlnHqYak58B2GCYgK6OdvWktr4WHN8nbEmwZvaTrijZvnww7h2FQG4BMcSlO6AWKAdjksJZlVDYLJs=#012[192.168.122.103]*,[np0005626459.ctlplane.localdomain]*,[172.17.0.103]*,[np0005626459.internalapi.localdomain]*,[172.18.0.103]*,[np0005626459.storage.localdomain]*,[172.20.0.103]*,[np0005626459.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005626459.tenant.localdomain]*,[np0005626459.localdomain]*,[np0005626459]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC9VsrIfV6Z4AiMtHfmjOpcBCt5sMsGmP0fOSak1UBP4r9lW4eYyoJY7Rtt1LDAcbGqdL3Nh3yc8ub0ekpXF6MA0vKucLb+jtjexv6t21W2grJ+ucwsvDhTDhDXmOUwD5G7A9Zj2WDqt/DN4DxeEqvQ6v1dSQaG+17BVPvM7mhgd5CSYOdUphCC81TPZgj3xyK31Q89biIS6pCBSKnsyN7qcU38bFGvRN0sTFaFt9KrIUfJJdcAZudw5Q/R775pmaaeHTSVPL05gE7dyz8RicEpenh6X0aZCOVt0+4VBnfXXSIL9QIwjrarPPKRdtmQY7dZ3dVNI1ZWA5YOl0y6R3fmxaRV5y1ZkDW6vG0463hYjKaAVqILAAPZGzhuzL7/1zxIv0guUB58tOUrCkkPIRzd6NQLL2j8L7RLIj3bZjG2xf0WiierxPsCEhl3wmdIVRUReE6jYalNGlscGUr1JWproKoaQqfck0OWhGy7jCCe8Gd8a/pr7jtg+X3bEMQ3HAc=#012[192.168.122.104]*,[np0005626460.ctlplane.localdomain]*,[172.17.0.104]*,[np0005626460.internalapi.localdomain]*,[172.18.0.104]*,[np0005626460.storage.localdomain]*,[172.20.0.104]*,[np0005626460.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005626460.tenant.localdomain]*,[np0005626460.localdomain]*,[np0005626460]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCeQmwl5IUCA7h6xphf+o3WARi0Xlj+0K08ltN/FCX7iF0EALCfDqtKOHz7wv5gS04Zx4aeNfcVHv9bHLRJxTPzliSNVutqA7vdFa0R/kRMdNzkqSOCuJ64sQ8GwSOHSrcFy7qC87BuP6xB9atSBjpAEB4NZOuXbvmSN/dCa/nNpUWoWNNg3eR5AalrExCptFYZ4E7YWvJ6HdZpr1QhcAJW0V1y4+u4FfzxHT2SQfGmua4TFHH1lUMiMrgAoELLe+pYdnWooEhRlkPulWy/wOyNz7aCCDP462XBhCc0CmiBDRwMBaJISck1pJCOIksvu8TYa6Fp8aayZqJvbUJYl5C1Z/o+zgHMTjeec0Th5GIuw9XUJkkx8TT5Fh7aWJvX9BbHlMaJjAqc+G/wiIImvKlsuIsovU6TH0P/XiysoWXeUWM7JqR8Y/05+yELy+xAMKT7PfEXE1fWOlGcCJsarLYGhh/7Jypwfh8Y/wOtYdKOGODxDnzq2f2VySsEiAf0EL0=#012[192.168.122.105]*,[np0005626461.ctlplane.localdomain]*,[172.17.0.105]*,[np0005626461.internalapi.localdomain]*,[172.18.0.105]*,[np0005626461.storage.localdomain]*,[172.20.0.105]*,[np0005626461.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005626461.tenant.localdomain]*,[np0005626461.localdomain]*,[np0005626461]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDDBCzU24t9gA5R+exm4rHJ2VytHuq8uUoKuu6SZ07dskKR77n7TwlsZhsDjpzwsddHd+lvsfvOVmolxjJsCmq7LJRMGA/mczHXsGGb43YPZPKsiJ6KMPDORy5/ihhnqixBYVmBGtdPu/Hh/udGnymZgR/RYGltDDHoCfGGiEcHJSIuf/Bv2Uv4xFnxFjDrWQFrkJ5Grq1xC7cGXgC3gAiTCjGHkG9rb/oyTUjjM8LaaRYIjeoDQZu1/8y5pl6cnhW21VTA+u55SkSimb/g5oOuSmrv899iHFwb54uLINXvA4aTtduUnxNQBVRyFvWa3yCZXVJeYlcVP8Q9tljn9anN1aISnS311Jmay6zUY927bxnzrpkwaV7Ggwtvi6vlVy84ZvOJ/IJ2boDiMujh1ZpT3bxXG3Oy0EjfBVbpkS6r2MbGTPj/xWnosJ6JNVbb9LW7Ftfi3/NFfAb7PpTgY036DA8LYoYIfqxVJUhlo5fJjqqOLa/zbvZVwrFCG+Zm160=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:45:49 localhost python3[38350]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.vetg1p3y' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:49 localhost python3[38368]: ansible-file Invoked with path=/tmp/ansible.vetg1p3y state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:45:50 localhost python3[38384]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 02:45:50 localhost 
python3[38400]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:51 localhost python3[38418]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:51 localhost python3[38437]: ansible-community.general.cloud_init_data_facts Invoked with filter=status Feb 23 02:45:54 localhost python3[38574]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:45:54 localhost python3[38591]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 02:45:57 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 23 02:45:57 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 23 02:45:58 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 02:45:58 localhost systemd[1]: Starting man-db-cache-update.service... 
Feb 23 02:45:58 localhost systemd[1]: Reloading. Feb 23 02:45:58 localhost systemd-rc-local-generator[38680]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:45:58 localhost systemd-sysv-generator[38684]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:45:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:45:58 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 23 02:45:58 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Feb 23 02:45:58 localhost systemd[1]: tuned.service: Deactivated successfully. Feb 23 02:45:58 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Feb 23 02:45:58 localhost systemd[1]: tuned.service: Consumed 1.572s CPU time. Feb 23 02:45:58 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 23 02:45:58 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 23 02:45:58 localhost systemd[1]: Finished man-db-cache-update.service. Feb 23 02:45:58 localhost systemd[1]: run-r3c66311e429d44ecb657594e1d396238.service: Deactivated successfully. Feb 23 02:45:59 localhost sshd[39003]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:45:59 localhost systemd[1]: Started Dynamic System Tuning Daemon. Feb 23 02:45:59 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 02:45:59 localhost systemd[1]: Starting man-db-cache-update.service... Feb 23 02:46:00 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 23 02:46:00 localhost systemd[1]: Finished man-db-cache-update.service. 
Feb 23 02:46:00 localhost systemd[1]: run-rb6d2207b3fb54b0dbea8f6d4ee22507c.service: Deactivated successfully. Feb 23 02:46:01 localhost python3[39029]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:46:01 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Feb 23 02:46:01 localhost systemd[1]: tuned.service: Deactivated successfully. Feb 23 02:46:01 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Feb 23 02:46:01 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 23 02:46:02 localhost systemd[1]: Started Dynamic System Tuning Daemon. Feb 23 02:46:03 localhost python3[39224]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:03 localhost python3[39241]: ansible-slurp Invoked with src=/etc/tuned/active_profile Feb 23 02:46:04 localhost python3[39257]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:46:04 localhost python3[39273]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:06 localhost python3[39293]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:07 localhost python3[39310]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True 
get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:46:09 localhost python3[39326]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:15 localhost python3[39342]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:15 localhost python3[39390]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:16 localhost python3[39435]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832775.2933772-71384-262129182644031/source _original_basename=tmpbu91ngux follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:16 localhost python3[39465]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 23 02:46:17 localhost python3[39513]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:17 localhost python3[39556]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832776.945125-71480-188798672625667/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=cb4e2d65c3f4c3faf38650c4c339d73dfcec347e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:18 localhost python3[39618]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:18 localhost python3[39661]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832777.899731-71543-22303193468700/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=e3816c2e211db94b1efb9354b78e4bda87216798 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:19 localhost python3[39723]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:19 localhost python3[39766]: ansible-ansible.legacy.copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832778.780203-71543-268509960303819/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=426c74ff16c690bcb458d5adf7a90df54cf7398a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:20 localhost python3[39828]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:20 localhost python3[39871]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832779.7350893-71543-17093954431605/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:21 localhost python3[39933]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:21 localhost python3[39976]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832780.8098772-71543-242073356606177/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Feb 23 02:46:22 localhost python3[40038]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:22 localhost python3[40081]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832781.6953962-71543-233202908316202/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=7b67a93cb6155d994227cb0fb8cb85d0abcca135 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:22 localhost python3[40143]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:23 localhost python3[40186]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832782.560317-71543-240669836635481/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:23 localhost python3[40248]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:24 localhost python3[40291]: ansible-ansible.legacy.copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832783.4746933-71543-180116458338279/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=4e4e677ff4d1886f9c2ad18567185be59ce1ed84 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:24 localhost python3[40353]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:25 localhost python3[40396]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832784.3707433-71543-210944442131248/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:25 localhost python3[40458]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:25 localhost sshd[40472]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:46:25 localhost python3[40503]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832785.2519917-71543-76207560337378/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:26 localhost python3[40565]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:26 localhost python3[40608]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832786.142713-71543-22452970220595/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=6914e83d930180efd1febf3d10b0106910a745c1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:46:27 localhost python3[40638]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:46:28 localhost python3[40686]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:46:28 localhost python3[40729]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832788.02342-72366-192368920528160/source _original_basename=tmp0wffb24f follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 23 02:46:33 localhost python3[40759]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 02:46:33 localhost python3[40820]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:38 localhost python3[40889]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:43 localhost python3[40932]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:44 localhost python3[40955]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:44 localhost python3[40978]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:45 localhost python3[41001]: 
ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:46 localhost python3[41024]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:46:47 localhost systemd[35778]: Starting Mark boot as successful... Feb 23 02:46:47 localhost systemd[35778]: Finished Mark boot as successful. Feb 23 02:46:52 localhost sshd[41033]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:47:08 localhost sshd[41035]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:47:27 localhost python3[41052]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:28 localhost python3[41100]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:28 localhost python3[41118]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpvgyu3l0p recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:28 localhost python3[41148]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:29 localhost python3[41196]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:29 localhost python3[41214]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:30 localhost python3[41276]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:30 localhost python3[41294]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:31 localhost python3[41356]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:31 localhost python3[41374]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:32 localhost python3[41436]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:32 localhost python3[41454]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:33 localhost python3[41516]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:33 localhost python3[41534]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json 
_original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:33 localhost python3[41596]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:34 localhost python3[41614]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:34 localhost python3[41676]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:34 localhost python3[41694]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:35 localhost python3[41756]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True 
checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:35 localhost python3[41774]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:35 localhost python3[41836]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:36 localhost python3[41854]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:37 localhost python3[41916]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:37 localhost python3[41934]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None 
access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:37 localhost python3[41996]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:38 localhost python3[42014]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:38 localhost python3[42044]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:47:39 localhost python3[42122]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:39 localhost python3[42156]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpzcj0ua9b recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:42 localhost python3[42218]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False 
bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 02:47:46 localhost python3[42235]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:47:47 localhost python3[42253]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:47:48 localhost python3[42271]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:47:48 localhost systemd[1]: Reloading. Feb 23 02:47:48 localhost systemd-rc-local-generator[42298]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:47:49 localhost systemd-sysv-generator[42302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:47:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:47:49 localhost systemd[1]: Starting Netfilter Tables... Feb 23 02:47:49 localhost systemd[1]: Finished Netfilter Tables. 
Feb 23 02:47:49 localhost python3[42361]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:50 localhost python3[42404]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832869.5947292-75156-190287893829712/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:50 localhost python3[42434]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:47:51 localhost python3[42452]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:47:51 localhost sshd[42478]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:47:51 localhost sshd[42488]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:47:52 localhost python3[42504]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:52 localhost python3[42548]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832871.8704898-75283-169528438329383/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa 
backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:53 localhost python3[42610]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:53 localhost python3[42653]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832873.1378508-75360-151464270445541/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:54 localhost python3[42715]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:47:54 localhost python3[42758]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832874.1292903-75423-111902914142600/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:47:55 localhost python3[42820]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False 
get_mime=True get_attributes=True
Feb 23 02:47:56 localhost python3[42863]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832875.4343626-75719-78158252116706/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:47:57 localhost python3[42925]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:47:57 localhost python3[42968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832876.7569578-75771-59745741056343/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:47:58 localhost python3[42998]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:47:59 localhost python3[43063]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:47:59 localhost python3[43080]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:47:59 localhost python3[43097]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:48:00 localhost python3[43116]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:00 localhost python3[43132]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:01 localhost python3[43148]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:01 localhost python3[43164]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 02:48:02 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=7 res=1
Feb 23 02:48:02 localhost python3[43184]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 02:48:03 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 23 02:48:03 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 02:48:03 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 02:48:03 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 02:48:03 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 02:48:03 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 02:48:03 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 02:48:03 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 02:48:03 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=8 res=1
Feb 23 02:48:03 localhost python3[43205]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 02:48:04 localhost kernel: SELinux: Converting 2704 SID table entries...
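The entries above (and throughout this log) follow one fixed shape: a journald prefix, a `python3[PID]` tag, and an `ansible-<module> Invoked with key=value …` argument dump. A small parser makes such a deployment log auditable programmatically, e.g. to list which nftables files were validated with `nft -c -f -` before `nft -f -` applied them. This is an illustrative sketch, not part of the deployment; the regex and the whitespace-based argument split are simplifications (values containing spaces, such as `_raw_params`, would need smarter tokenizing).

```python
import re

# Matches entries such as:
#   Feb 23 02:48:15 localhost python3[43610]: ansible-sysctl Invoked with name=... value=...
ENTRY_RE = re.compile(
    r"(?P<ts>\w{3} \d+ [\d:]+) localhost python3\[(?P<pid>\d+)\]: "
    r"ansible-(?P<module>[\w.]+) Invoked with (?P<args>.*)"
)

def parse_ansible_entry(line):
    """Return {ts, pid, module, args} for an 'Invoked with' entry, else None."""
    m = ENTRY_RE.match(line)
    if not m:
        return None
    # Naive split: breaks on values containing spaces, fine for simple modules.
    args = dict(kv.split("=", 1) for kv in m.group("args").split() if "=" in kv)
    return {"ts": m.group("ts"), "pid": int(m.group("pid")),
            "module": m.group("module"), "args": args}

entry = parse_ansible_entry(
    "Feb 23 02:48:15 localhost python3[43610]: ansible-sysctl Invoked with "
    "name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present"
)
```

Filtering the parsed stream by `module` then recovers, for instance, every `ansible-sysctl` call and its `name`/`value` pair.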
Feb 23 02:48:04 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 02:48:04 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 02:48:04 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 02:48:04 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 02:48:04 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 02:48:04 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 02:48:04 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 02:48:04 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=9 res=1
Feb 23 02:48:05 localhost python3[43226]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 02:48:05 localhost kernel: SELinux: Converting 2704 SID table entries...
Feb 23 02:48:05 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 02:48:05 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 02:48:05 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 02:48:05 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 02:48:05 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 02:48:05 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 02:48:05 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 02:48:06 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=10 res=1
Feb 23 02:48:06 localhost python3[43247]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:06 localhost python3[43263]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:07 localhost python3[43279]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:07 localhost python3[43295]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:48:07 localhost python3[43311]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:48:08 localhost python3[43328]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:48:12 localhost python3[43345]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:12 localhost python3[43393]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:48:13 localhost python3[43436]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832892.3124158-76451-162393161049223/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:13 localhost python3[43466]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 02:48:13 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 02:48:13 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 23 02:48:13 localhost systemd[1]: Stopping Load Kernel Modules...
Feb 23 02:48:13 localhost systemd[1]: Starting Load Kernel Modules...
Feb 23 02:48:13 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
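The `99-tripleo.conf` dropped into `/etc/modules-load.d` above is, per modules-load.d(5) syntax, a plain list of module names, one per line, which `systemd-modules-load` reads on (re)start; the kernel message that follows explains why `br_netfilter` is on that list. A minimal render of such a file, assuming a module list for illustration (the log only confirms `br_netfilter` was inserted and that `msr` is built in):

```python
def render_modules_load(modules):
    """Render a modules-load.d(5) fragment: one module name per line."""
    # '#'-prefixed lines are comments in modules-load.d syntax.
    lines = ["# illustrative fragment - not the actual TripleO template"]
    lines += modules
    return "\n".join(lines) + "\n"

conf = render_modules_load(["br_netfilter"])
```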
Feb 23 02:48:13 localhost kernel: Bridge firewalling registered
Feb 23 02:48:13 localhost systemd-modules-load[43469]: Inserted module 'br_netfilter'
Feb 23 02:48:13 localhost systemd-modules-load[43469]: Module 'msr' is built in
Feb 23 02:48:13 localhost systemd[1]: Finished Load Kernel Modules.
Feb 23 02:48:14 localhost python3[43520]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:48:14 localhost python3[43563]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832893.7593112-76550-127741251738842/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:48:14 localhost python3[43593]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:15 localhost python3[43610]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:16 localhost python3[43628]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:16 localhost python3[43646]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:17 localhost python3[43663]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:17 localhost python3[43680]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:17 localhost python3[43697]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:18 localhost python3[43715]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:18 localhost python3[43733]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:18 localhost python3[43751]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:19 localhost python3[43769]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:19 localhost python3[43787]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:19 localhost python3[43805]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:20 localhost python3[43823]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:20 localhost python3[43840]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:20 localhost python3[43857]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:20 localhost python3[43874]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:21 localhost python3[43891]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 23 02:48:21 localhost python3[43909]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 02:48:21 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 23 02:48:21 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 23 02:48:21 localhost systemd[1]: Stopping Apply Kernel Variables...
Feb 23 02:48:21 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 23 02:48:21 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 23 02:48:21 localhost systemd[1]: Finished Apply Kernel Variables.
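Each `ansible-sysctl` call above persists one key into `/etc/sysctl.d/99-tripleo.conf` (`sysctl_set=True state=present`), and the final restart of `systemd-sysctl.service` applies the file. The file format is simply `key = value` lines, per sysctl.d(5). A sketch that renders the bridge/forwarding subset actually visible in this log (keys and values taken from the entries above; the real file is produced from the `tripleo-sysctl.conf.j2` template):

```python
# Key/value pairs as logged by the ansible-sysctl invocations above.
SETTINGS = {
    "net.bridge.bridge-nf-call-arptables": "1",
    "net.bridge.bridge-nf-call-ip6tables": "1",
    "net.bridge.bridge-nf-call-iptables": "1",
    "net.ipv4.ip_forward": "1",
}

def render_sysctl_conf(settings):
    """Render a sysctl.d(5) fragment: 'key = value', one per line."""
    return "".join(f"{key} = {value}\n" for key, value in settings.items())

conf = render_sysctl_conf(SETTINGS)
```

Note the ordering dependency captured by the log: the `net.bridge.*` keys only exist once `br_netfilter` is loaded, which is why the modules-load step runs first.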
Feb 23 02:48:22 localhost python3[43929]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:22 localhost python3[43945]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:22 localhost python3[43961]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:23 localhost python3[43977]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:48:23 localhost python3[43993]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:23 localhost 
python3[44009]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:24 localhost python3[44025]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:24 localhost python3[44041]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:24 localhost python3[44057]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:25 localhost python3[44105]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:48:25 localhost python3[44148]: 
ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832904.8883739-76904-214070854249969/source _original_basename=tmpvoha4s0v follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:25 localhost python3[44178]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:48:27 localhost python3[44195]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:27 localhost python3[44243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:48:28 localhost python3[44286]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832907.6361003-77346-32685908769013/source _original_basename=tmp2pg00k07 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:28 localhost 
python3[44316]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:29 localhost python3[44332]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:29 localhost python3[44348]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:29 localhost python3[44364]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:30 localhost python3[44380]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:30 localhost python3[44396]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:30 localhost python3[44412]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:31 localhost python3[44428]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:31 localhost python3[44444]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:31 localhost python3[44460]: ansible-group Invoked with gid=107 name=qemu state=present 
system=False local=False non_unique=False Feb 23 02:48:32 localhost python3[44482]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Feb 23 02:48:32 localhost python3[44506]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None Feb 23 02:48:32 localhost python3[44522]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:48:33 localhost python3[44571]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:48:33 localhost python3[44614]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832913.1796858-77560-186325677768155/source _original_basename=tmp6h7dzj6q follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False 
force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:34 localhost python3[44644]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Feb 23 02:48:35 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=11 res=1 Feb 23 02:48:35 localhost python3[44664]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:35 localhost python3[44680]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:35 localhost python3[44696]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False Feb 23 02:48:37 localhost sshd[44704]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:48:37 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=12 res=1 Feb 23 02:48:37 localhost python3[44721]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] 
installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 02:48:40 localhost python3[44738]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 02:48:41 localhost python3[44799]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:41 localhost python3[44815]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:48:42 localhost python3[44889]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:48:42 localhost python3[44948]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832921.9844236-77903-250384013012976/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=553fb9d1f969873d7079b6b23ab84e26a2830710 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:43 
localhost python3[45060]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:48:43 localhost python3[45123]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832922.9251978-77955-206199377258367/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:44 localhost python3[45173]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:44 localhost python3[45198]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:44 localhost python3[45214]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:45 localhost python3[45230]: ansible-ini_file Invoked with 
path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:45 localhost python3[45278]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:48:46 localhost python3[45321]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832925.4933586-78133-176510434695991/source _original_basename=tmp3fas637t follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:48:46 localhost python3[45351]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:47 localhost python3[45367]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:48:47 
localhost python3[45383]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:48:50 localhost sshd[45385]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:48:51 localhost python3[45434]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:48:51 localhost python3[45479]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832930.8503578-78353-208169489117578/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:48:52 localhost python3[45510]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:48:52 localhost systemd[1]: Stopping OpenSSH server daemon...
Feb 23 02:48:52 localhost systemd[1]: sshd.service: Deactivated successfully.
Feb 23 02:48:52 localhost systemd[1]: Stopped OpenSSH server daemon.
Feb 23 02:48:52 localhost systemd[1]: sshd.service: Consumed 3.936s CPU time, read 1.9M from disk, written 56.0K to disk.
Feb 23 02:48:52 localhost systemd[1]: Stopped target sshd-keygen.target.
Feb 23 02:48:52 localhost systemd[1]: Stopping sshd-keygen.target...
Feb 23 02:48:52 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 02:48:52 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 02:48:52 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 02:48:52 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 23 02:48:52 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 23 02:48:52 localhost sshd[45514]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:48:52 localhost systemd[1]: Started OpenSSH server daemon.
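The ansible-ini_file invocations above pin Podman's defaults in /etc/containers/containers.conf: [containers] pids_limit=4096, [engine] events_logger="journald" and runtime="crun", [network] network_backend="netavark". A minimal sketch of that idempotent set-option behavior, using Python's configparser (an illustration only: containers.conf is actually TOML, and the real module also manages ownership, mode, SELinux type, and value quoting):

```python
import configparser

def ini_set(path, section, option, value):
    """Idempotently ensure [section] option=value in an INI-style file
    (rough sketch of what ansible-ini_file with state=present does)."""
    cfg = configparser.ConfigParser()
    cfg.read(path)                       # a missing file reads as empty
    if not cfg.has_section(section):
        cfg.add_section(section)
    if cfg.get(section, option, fallback=None) == value:
        return False                     # already in desired state: no write
    cfg.set(section, option, value)
    with open(path, "w") as f:
        cfg.write(f)
    return True                          # changed
```

Called repeatedly with the same arguments, as the play does on every run, only the first call reports a change; e.g. `ini_set(conf, "containers", "pids_limit", "4096")`.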
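The sshd_config copy above passes validate=/usr/sbin/sshd -T -f %s, so Ansible syntax-checks the candidate file before it ever replaces /etc/ssh/sshd_config and restarts the daemon. A sketch of that write-validate-swap pattern, with a generic validator command injected so it does not require sshd (Ansible substitutes the temp path for %s; here it is appended as the last argument):

```python
import os
import shutil
import subprocess
import tempfile

def install_validated(dest, content, validate_argv):
    """Write content to a temp file next to dest, run the validator on it,
    and move it into place only if validation succeeds."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(dest) or ".")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(content)
        # raises CalledProcessError on a non-zero exit, leaving dest untouched
        subprocess.run(validate_argv + [tmp], check=True)
        shutil.move(tmp, dest)  # rename; atomic when on the same filesystem
    except Exception:
        if os.path.exists(tmp):
            os.unlink(tmp)
        raise
```

For the real task the validator would be `["/usr/sbin/sshd", "-T", "-f"]`; a broken template then fails the play instead of locking you out of SSH.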
Feb 23 02:48:52 localhost python3[45530]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:48:53 localhost python3[45548]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:48:54 localhost python3[45566]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:48:57 localhost python3[45615]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:48:57 localhost python3[45633]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:48:58 localhost python3[45663]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:48:59 localhost python3[45713]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:48:59 localhost python3[45731]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:00 localhost python3[45761]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:49:00 localhost systemd[1]: Reloading.
Feb 23 02:49:00 localhost systemd-rc-local-generator[45783]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:49:00 localhost systemd-sysv-generator[45787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:49:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:49:00 localhost systemd[1]: Starting chronyd online sources service...
Feb 23 02:49:00 localhost chronyc[45801]: 200 OK
Feb 23 02:49:00 localhost systemd[1]: chrony-online.service: Deactivated successfully.
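After bringing chrony's sources online, the play steps the clock with `chronyc makestep` and then blocks on `chronyc waitsync 30` (up to 30 tries, roughly 10 seconds apart by default) until chronyd reports synchronization. A sketch of that step-then-wait sequence, with the command runner injected so it can be exercised without a live chronyd:

```python
import subprocess

def step_and_wait(runner=subprocess.run):
    """Step the system clock immediately, then wait for chrony to report
    sync; returns True once synchronized (sketch of the two logged tasks)."""
    runner(["chronyc", "makestep"], check=True)
    # waitsync exits non-zero if sync is not achieved within the try limit
    result = runner(["chronyc", "waitsync", "30"], check=False)
    return result.returncode == 0
```

The log shows the effect of the first call: "System clock was stepped by 0.000003 seconds", i.e. the clock was already essentially correct.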
Feb 23 02:49:00 localhost systemd[1]: Finished chronyd online sources service. Feb 23 02:49:00 localhost python3[45817]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:49:00 localhost chronyd[25974]: System clock was stepped by 0.000003 seconds Feb 23 02:49:01 localhost python3[45834]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 02:49:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 02:49:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3257 writes, 16K keys, 3257 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s#012Cumulative WAL: 3257 writes, 144 syncs, 22.62 writes per sync, written: 0.01 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3257 writes, 16K keys, 3257 commit groups, 1.0 writes per commit group, ingest: 14.67 MB, 0.02 MB/s#012Interval WAL: 3257 writes, 144 syncs, 22.62 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 
0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) 
Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry 
stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 
memtable_compaction, 0 memtable
Feb 23 02:49:01 localhost python3[45851]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:01 localhost chronyd[25974]: System clock was stepped by 0.000000 seconds
Feb 23 02:49:01 localhost python3[45868]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:02 localhost python3[45885]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 23 02:49:02 localhost systemd[1]: Starting Time & Date Service...
Feb 23 02:49:02 localhost systemd[1]: Started Time & Date Service.
Feb 23 02:49:03 localhost python3[45905]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:04 localhost python3[45922]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:04 localhost python3[45939]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 23 02:49:05 localhost python3[45955]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:49:05 localhost python3[45971]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:49:06 localhost python3[45987]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 02:49:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 02:49:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3385 writes, 16K keys, 3385 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3385 writes, 196 syncs, 17.27 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3385 writes, 16K keys, 3385 commit groups, 1.0 writes per commit group, ingest: 15.28 MB, 0.03 MB/s#012Interval WAL: 3385 writes, 196 syncs, 17.27 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 
0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp 
Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 
KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Feb 23 02:49:06 
localhost python3[46035]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:49:06 localhost python3[46078]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832946.1500823-79369-133495529295441/source _original_basename=tmpthq512t7 follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:07 localhost python3[46140]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:49:07 localhost python3[46183]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832947.008525-79419-69708615694195/source _original_basename=tmp18s25o1d follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:08 localhost python3[46213]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 02:49:08 localhost systemd[1]: Reloading.
Feb 23 02:49:08 localhost systemd-sysv-generator[46242]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:49:08 localhost systemd-rc-local-generator[46238]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:49:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:49:09 localhost python3[46267]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:49:09 localhost python3[46283]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:09 localhost python3[46300]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:09 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
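The `ip netns add ns_temp` / `ip netns delete ns_temp` pair above is a quick capability probe: if a throwaway network namespace can be created and removed, the host can support neutron's namespace-based agents (the follow-up run-netns-ns_temp.mount cleanup confirms the delete). A sketch of the probe with an injected runner, since the real commands need root:

```python
import subprocess

def netns_probe(name="ns_temp", runner=None):
    """Return True if a scratch network namespace can be added and then
    deleted; runner maps an argv list to an exit code (0 == success)."""
    run = runner or (lambda argv: subprocess.run(argv).returncode)
    if run(["ip", "netns", "add", name]) != 0:
        return False            # kernel/iproute2 can't create namespaces
    return run(["ip", "netns", "delete", name]) == 0
```

On the node itself this reduces to `netns_probe()` run as root; the injected runner exists purely so the logic can be tested without privileges.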
Feb 23 02:49:10 localhost python3[46317]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:49:10 localhost python3[46333]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:10 localhost python3[46381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:49:11 localhost python3[46424]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832950.6138618-79644-257178056254103/source _original_basename=tmpdxa2_iy_ follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:23 localhost sshd[46439]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:49:32 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
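Note that some copy/file tasks in this log report modes in decimal rather than octal: the haproxy-kill copy logs mode=493, and other ansible-file entries log mode=488 or mode=420. These are just the decimal renderings of the intended octal permissions (an unquoted numeric mode in YAML is parsed as an integer), so 493 is 0o755 (rwxr-xr-x), 488 is 0o750, and 420 is 0o644. The equivalence:

```python
import stat

# Decimal modes from the log and their octal/permission-string forms.
# 493 == 0o755, 488 == 0o750, 420 == 0o644
for decimal, octal in [(493, 0o755), (488, 0o750), (420, 0o644)]:
    assert decimal == octal

# stat.filemode renders a full st_mode; S_IFREG marks a regular file.
print(stat.filemode(stat.S_IFREG | 493))  # -rwxr-xr-x
```

So the resulting permissions are correct here; the classic pitfall only bites when a mode without a leading zero (e.g. 644) is parsed as decimal.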
Feb 23 02:49:33 localhost python3[46458]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 23 02:49:34 localhost python3[46474]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Feb 23 02:49:34 localhost python3[46490]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 23 02:49:34 localhost python3[46506]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:49:35 localhost python3[46522]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:35 localhost python3[46538]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 23 02:49:36 localhost kernel: SELinux: Converting 2707 SID table entries...
Feb 23 02:49:36 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 02:49:36 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 02:49:36 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 02:49:36 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 02:49:36 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 02:49:36 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 02:49:36 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 02:49:36 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=13 res=1
Feb 23 02:49:36 localhost python3[46559]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:49:39 localhost python3[46696]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image':
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 
'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [
Feb 23 02:49:39 localhost rsyslogd[758]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config
Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Feb 23 02:49:39 localhost python3[46712]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 02:49:40 localhost python3[46728]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 02:49:40 localhost python3[46744]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source':
'/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': 
True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n -iNONE', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Feb 23 02:49:45 localhost python3[46854]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:49:45 localhost python3[46897]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771832985.2449641-81140-78626805301400/source _original_basename=tmp23rxc6sd follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:49:46 localhost
python3[46927]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:49:47 localhost systemd[35778]: Created slice User Background Tasks Slice.
Feb 23 02:49:47 localhost systemd[35778]: Starting Cleanup of User's Temporary Files and Directories...
Feb 23 02:49:47 localhost systemd[35778]: Finished Cleanup of User's Temporary Files and Directories.
Feb 23 02:49:48 localhost python3[47066]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 02:49:49 localhost sshd[47115]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:49:50 localhost python3[47189]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 02:49:52 localhost python3[47205]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:49:53 localhost python3[47222]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True
lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 23 02:50:02 localhost dbus-broker-launch[18433]: Noticed file-system modification, trigger reload.
Feb 23 02:50:02 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 02:50:02 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 23 02:50:02 localhost dbus-broker-launch[18433]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 23 02:50:02 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 02:50:02 localhost systemd[1]: Reexecuting.
Feb 23 02:50:02 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 23 02:50:02 localhost systemd[1]: Detected virtualization kvm.
Feb 23 02:50:02 localhost systemd[1]: Detected architecture x86-64.
Feb 23 02:50:02 localhost systemd-rc-local-generator[47277]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:50:02 localhost systemd-sysv-generator[47281]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:50:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:50:08 localhost sshd[47299]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:50:10 localhost kernel: SELinux: Converting 2707 SID table entries...
Feb 23 02:50:10 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 02:50:10 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 02:50:10 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 02:50:10 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 02:50:10 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 02:50:10 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 02:50:10 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 02:50:11 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 02:50:11 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=14 res=1
Feb 23 02:50:11 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 23 02:50:12 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 02:50:12 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 23 02:50:12 localhost systemd[1]: Reloading.
Feb 23 02:50:12 localhost systemd-rc-local-generator[47369]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:50:12 localhost systemd-sysv-generator[47375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:50:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:50:12 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 02:50:12 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 02:50:12 localhost systemd-journald[618]: Journal stopped
Feb 23 02:50:12 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Feb 23 02:50:12 localhost systemd[1]: Stopping Journal Service...
Feb 23 02:50:12 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 23 02:50:12 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 23 02:50:12 localhost systemd[1]: Stopped Journal Service.
Feb 23 02:50:12 localhost systemd[1]: systemd-journald.service: Consumed 1.766s CPU time.
Feb 23 02:50:12 localhost systemd[1]: Starting Journal Service...
Feb 23 02:50:12 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 23 02:50:12 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 23 02:50:12 localhost systemd[1]: systemd-udevd.service: Consumed 3.126s CPU time.
Feb 23 02:50:12 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 23 02:50:12 localhost systemd-journald[47710]: Journal started
Feb 23 02:50:12 localhost systemd-journald[47710]: Runtime Journal (/run/log/journal/c0212a8b024a111cfc61293864f36c87) is 12.2M, max 314.7M, 302.5M free.
Feb 23 02:50:12 localhost systemd[1]: Started Journal Service.
Feb 23 02:50:12 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 23 02:50:12 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 02:50:12 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 02:50:12 localhost systemd-udevd[47718]: Using default interface naming scheme 'rhel-9.0'.
Feb 23 02:50:12 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 23 02:50:12 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 02:50:12 localhost systemd[1]: Reloading.
Feb 23 02:50:13 localhost systemd-rc-local-generator[48322]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:50:13 localhost systemd-sysv-generator[48331]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:50:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:50:13 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 02:50:13 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 02:50:13 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 02:50:13 localhost systemd[1]: man-db-cache-update.service: Consumed 1.342s CPU time.
Feb 23 02:50:13 localhost systemd[1]: run-r1c26139e872348a7b6cc176f2f200207.service: Deactivated successfully.
Feb 23 02:50:13 localhost systemd[1]: run-ra2dd25815b9c4efc8a03f822211f16fd.service: Deactivated successfully.
Feb 23 02:50:15 localhost python3[48718]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Feb 23 02:50:15 localhost python3[48737]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 02:50:16 localhost python3[48755]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 02:50:16 localhost python3[48755]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Feb 23 02:50:16 localhost python3[48755]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Feb 23 02:50:23 localhost podman[48768]: 2026-02-23 07:50:16.669682688 +0000 UTC m=+0.039947640 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 02:50:23 localhost python3[48755]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 591bb9fb46a70e9f840f28502388406078442df6b6701a3c17990ee75e333673 --format json
Feb 23 02:50:24 localhost python3[48870]: ansible-containers.podman.podman_image Invoked with force=True
name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 02:50:24 localhost python3[48870]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Feb 23 02:50:24 localhost python3[48870]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Feb 23 02:50:31 localhost podman[48883]: 2026-02-23 07:50:24.298567215 +0000 UTC m=+0.044786519 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 23 02:50:31 localhost python3[48870]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d59b33e7fb841c47a47a12b18fb68b11debd968b4596c63f3177ecc7400fb1bc --format json
Feb 23 02:50:31 localhost python3[48987]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 23 02:50:31 localhost python3[48987]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG:
/bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json Feb 23 02:50:31 localhost python3[48987]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false Feb 23 02:50:44 localhost sshd[49308]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:50:51 localhost podman[48999]: 2026-02-23 07:50:31.883839947 +0000 UTC m=+0.026881540 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 02:50:51 localhost python3[48987]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 6eddd23e1e6adfbfa713a747123707c02f92ffdbf1913da92f171aba1d6d7856 --format json Feb 23 02:50:51 localhost systemd[1]: tmp-crun.mvJ2O7.mount: Deactivated successfully. Feb 23 02:50:51 localhost podman[49440]: 2026-02-23 07:50:51.635154564 +0000 UTC m=+0.118668395 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-type=git, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, 
io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z) Feb 23 02:50:51 localhost python3[49464]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:50:51 localhost python3[49464]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json Feb 23 02:50:51 localhost python3[49464]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false Feb 23 02:50:51 localhost podman[49440]: 2026-02-23 07:50:51.746252873 +0000 UTC m=+0.229766724 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, name=rhceph, release=1770267347, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True) Feb 23 02:50:53 localhost sshd[49644]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:51:04 localhost podman[49485]: 2026-02-23 07:50:51.783340871 +0000 UTC m=+0.035920803 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 02:51:04 localhost python3[49464]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 2c8610235afe953aa46efb141a5a988799548b22280d65a7e7ab21889422df37 --format json Feb 23 02:51:04 localhost python3[49699]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:04 localhost python3[49699]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json Feb 23 02:51:04 localhost python3[49699]: 
ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false Feb 23 02:51:12 localhost podman[49712]: 2026-02-23 07:51:04.714589934 +0000 UTC m=+0.042490858 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 23 02:51:12 localhost python3[49699]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ab5aab6d0c3ec80926032b7acf4cec1d4710f1c2daccd17ae4daa64399ec237 --format json Feb 23 02:51:12 localhost python3[49964]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:12 localhost python3[49964]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json Feb 23 02:51:12 localhost python3[49964]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false Feb 23 02:51:16 localhost podman[49977]: 2026-02-23 07:51:12.855533354 +0000 UTC m=+0.041293521 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 23 02:51:16 localhost python3[49964]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 4853142d85dba3766b28d28ae195b26f7242230fe3646e9590a7aee2dc2e0dfa --format json Feb 23 02:51:17 localhost python3[50056]: ansible-containers.podman.podman_image Invoked with force=True 
name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:17 localhost python3[50056]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json Feb 23 02:51:17 localhost python3[50056]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false Feb 23 02:51:19 localhost podman[50069]: 2026-02-23 07:51:17.318590851 +0000 UTC m=+0.028555403 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 23 02:51:19 localhost python3[50056]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ac6ea63c0fb4851145e847f9ced2f20804afc8472907b63a82d5866f5cf608a --format json Feb 23 02:51:19 localhost python3[50147]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:19 localhost python3[50147]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json Feb 
23 02:51:19 localhost python3[50147]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false Feb 23 02:51:21 localhost podman[50161]: 2026-02-23 07:51:19.989754824 +0000 UTC m=+0.042773566 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 23 02:51:21 localhost python3[50147]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ba1a08ea1c1207b471b1f02cee16ff456b8a812662cce16906d16de330a66d63 --format json Feb 23 02:51:22 localhost python3[50238]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:22 localhost python3[50238]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json Feb 23 02:51:22 localhost python3[50238]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false Feb 23 02:51:24 localhost podman[50250]: 2026-02-23 07:51:22.422270718 +0000 UTC m=+0.044129849 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 23 02:51:24 localhost python3[50238]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 8576d3a17e57ea28f29435f132f583320941b5aa7bf0aa02e998b09a094d1fe8 --format json Feb 23 02:51:25 localhost python3[50329]: 
ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:25 localhost python3[50329]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json Feb 23 02:51:25 localhost python3[50329]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false Feb 23 02:51:28 localhost podman[50341]: 2026-02-23 07:51:25.396542765 +0000 UTC m=+0.043706366 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 23 02:51:28 localhost python3[50329]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7fcbf63c0504494c8fcaa07583f909a06486472a0982aeac9554c6fdbeb04c9a --format json Feb 23 02:51:29 localhost python3[50430]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Feb 23 02:51:29 localhost python3[50430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: 
/bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json Feb 23 02:51:29 localhost python3[50430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false Feb 23 02:51:31 localhost podman[50442]: 2026-02-23 07:51:29.306750704 +0000 UTC m=+0.048789324 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 23 02:51:31 localhost python3[50430]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 72ddf109f135b64d3116af7b84caaa358dc72e2e60f4c8753fa54fa65b76ba35 --format json Feb 23 02:51:31 localhost python3[50519]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:51:33 localhost ansible-async_wrapper.py[50691]: Invoked with 138320676718 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833092.9381783-84129-259736671855792/AnsiballZ_command.py _ Feb 23 02:51:33 localhost ansible-async_wrapper.py[50694]: Starting module and watcher Feb 23 02:51:33 localhost ansible-async_wrapper.py[50694]: Start watching 50695 (3600) Feb 23 02:51:33 localhost ansible-async_wrapper.py[50695]: Start module (50695) Feb 23 02:51:33 localhost ansible-async_wrapper.py[50691]: Return async_wrapper task started. Feb 23 02:51:33 localhost python3[50715]: ansible-ansible.legacy.async_status Invoked with jid=138320676718.50691 mode=status _async_dir=/tmp/.ansible_async Feb 23 02:51:36 localhost sshd[50825]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:51:37 localhost puppet-user[50714]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Feb 23 02:51:37 localhost puppet-user[50714]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:37 localhost puppet-user[50714]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:37 localhost puppet-user[50714]: (file & line not available) Feb 23 02:51:37 localhost puppet-user[50714]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:37 localhost puppet-user[50714]: (file & line not available) Feb 23 02:51:37 localhost puppet-user[50714]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 23 02:51:38 localhost puppet-user[50714]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 23 02:51:38 localhost puppet-user[50714]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.12 seconds Feb 23 02:51:38 localhost puppet-user[50714]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully Feb 23 02:51:38 localhost puppet-user[50714]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created Feb 23 02:51:38 localhost puppet-user[50714]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully Feb 23 02:51:38 localhost puppet-user[50714]: Notice: Applied catalog in 0.06 seconds Feb 23 02:51:38 localhost puppet-user[50714]: Application: Feb 23 02:51:38 localhost puppet-user[50714]: Initial environment: production Feb 23 02:51:38 localhost puppet-user[50714]: Converged environment: production Feb 23 02:51:38 localhost puppet-user[50714]: Run mode: user Feb 23 02:51:38 localhost puppet-user[50714]: 
Changes: Feb 23 02:51:38 localhost puppet-user[50714]: Total: 3 Feb 23 02:51:38 localhost puppet-user[50714]: Events: Feb 23 02:51:38 localhost puppet-user[50714]: Success: 3 Feb 23 02:51:38 localhost puppet-user[50714]: Total: 3 Feb 23 02:51:38 localhost puppet-user[50714]: Resources: Feb 23 02:51:38 localhost puppet-user[50714]: Changed: 3 Feb 23 02:51:38 localhost puppet-user[50714]: Out of sync: 3 Feb 23 02:51:38 localhost puppet-user[50714]: Total: 10 Feb 23 02:51:38 localhost puppet-user[50714]: Time: Feb 23 02:51:38 localhost puppet-user[50714]: Schedule: 0.00 Feb 23 02:51:38 localhost puppet-user[50714]: File: 0.00 Feb 23 02:51:38 localhost puppet-user[50714]: Exec: 0.02 Feb 23 02:51:38 localhost puppet-user[50714]: Augeas: 0.02 Feb 23 02:51:38 localhost puppet-user[50714]: Transaction evaluation: 0.05 Feb 23 02:51:38 localhost puppet-user[50714]: Catalog application: 0.06 Feb 23 02:51:38 localhost puppet-user[50714]: Config retrieval: 0.16 Feb 23 02:51:38 localhost puppet-user[50714]: Last run: 1771833098 Feb 23 02:51:38 localhost puppet-user[50714]: Filebucket: 0.00 Feb 23 02:51:38 localhost puppet-user[50714]: Total: 0.06 Feb 23 02:51:38 localhost puppet-user[50714]: Version: Feb 23 02:51:38 localhost puppet-user[50714]: Config: 1771833097 Feb 23 02:51:38 localhost puppet-user[50714]: Puppet: 7.10.0 Feb 23 02:51:38 localhost ansible-async_wrapper.py[50695]: Module complete (50695) Feb 23 02:51:38 localhost ansible-async_wrapper.py[50694]: Done in kid B. 
Feb 23 02:51:40 localhost sshd[51071]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 02:51:44 localhost python3[51088]: ansible-ansible.legacy.async_status Invoked with jid=138320676718.50691 mode=status _async_dir=/tmp/.ansible_async
Feb 23 02:51:44 localhost python3[51104]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 02:51:45 localhost python3[51120]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:51:45 localhost python3[51168]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:51:46 localhost python3[51211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833105.2946615-84391-65683064465177/source _original_basename=tmpw3qn043_ follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 02:51:46 localhost python3[51241]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:51:47 localhost python3[51344]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 02:51:48 localhost python3[51363]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 02:51:48 localhost python3[51379]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005626463 step=1 update_config_hash_only=False
Feb 23 02:51:49 localhost python3[51395]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:51:49 localhost python3[51411]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 02:51:50 localhost python3[51427]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Feb 23 02:51:50 localhost python3[51466]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 02:51:51 localhost podman[51642]: 2026-02-23 07:51:51.271615711 +0000 UTC m=+0.068008691 container create 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=container-puppet-iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 02:51:51 localhost podman[51664]: 2026-02-23 07:51:51.306306177 +0000 UTC m=+0.088385513 container create 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, architecture=x86_64, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, container_name=container-puppet-metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510)
Feb 23 02:51:51 localhost podman[51674]: 2026-02-23 07:51:51.315433701 +0000 UTC m=+0.074187313 container create eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_puppet_step1, container_name=container-puppet-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 23 02:51:51 localhost systemd[1]: Started libpod-conmon-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope.
Feb 23 02:51:51 localhost podman[51673]: 2026-02-23 07:51:51.330757086 +0000 UTC m=+0.101187061 container create 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, container_name=container-puppet-crond, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, tcib_managed=true)
Feb 23 02:51:51 localhost podman[51684]: 2026-02-23 07:51:51.342453639 +0000 UTC m=+0.085646998 container create ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=container-puppet-nova_libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS':
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 23 02:51:51 localhost systemd[1]: Started libcrun container. Feb 23 02:51:51 localhost systemd[1]: Started libpod-conmon-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope. Feb 23 02:51:51 localhost systemd[1]: Started libpod-conmon-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope. Feb 23 02:51:51 localhost podman[51642]: 2026-02-23 07:51:51.243616222 +0000 UTC m=+0.040009232 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 23 02:51:51 localhost systemd[1]: Started libpod-conmon-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope. Feb 23 02:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:51 localhost systemd[1]: Started libcrun container. 
Feb 23 02:51:51 localhost podman[51673]: 2026-02-23 07:51:51.253307853 +0000 UTC m=+0.023737818 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 23 02:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6de1ea423bd6d005b3c98cfa65644155837a90a885b901ec0c789a8fa4573360/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:51 localhost systemd[1]: Started libcrun container. Feb 23 02:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:51 localhost podman[51664]: 2026-02-23 07:51:51.266187213 +0000 UTC m=+0.048266569 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 23 02:51:51 localhost podman[51664]: 2026-02-23 07:51:51.366991811 +0000 UTC m=+0.149071157 container init 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, architecture=x86_64, container_name=container-puppet-metrics_qdr, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 02:51:51 localhost podman[51674]: 2026-02-23 07:51:51.26966054 +0000 UTC m=+0.028414182 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 23 02:51:51 localhost podman[51674]: 2026-02-23 07:51:51.368768076 
+0000 UTC m=+0.127521688 container init eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., vcs-type=git, container_name=container-puppet-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 23 02:51:51 localhost systemd[1]: Started libcrun container. Feb 23 02:51:51 localhost systemd[1]: Started libpod-conmon-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope. 
Feb 23 02:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/912153276d119d62292ee43dc157a09b9029351ec12b42dcebd4c826260b5572/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:51 localhost podman[51664]: 2026-02-23 07:51:51.379079996 +0000 UTC m=+0.161159342 container start 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, distribution-scope=public, url=https://www.redhat.com, container_name=container-puppet-metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Feb 23 02:51:51 localhost podman[51664]: 2026-02-23 07:51:51.382103549 +0000 UTC m=+0.164182905 container attach 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=) Feb 23 02:51:51 localhost podman[51684]: 2026-02-23 07:51:51.292272302 +0000 UTC m=+0.035465671 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 02:51:51 localhost systemd[1]: Started libcrun container. Feb 23 02:51:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:52 localhost podman[51684]: 2026-02-23 07:51:52.338242141 +0000 UTC m=+1.081435560 container init ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 
'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, container_name=container-puppet-nova_libvirt, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 02:51:52 localhost systemd[1]: tmp-crun.n3i5Fg.mount: Deactivated successfully. Feb 23 02:51:52 localhost podman[51684]: 2026-02-23 07:51:52.360708528 +0000 UTC m=+1.103901927 container start ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, release=1766032510, config_id=tripleo_puppet_step1, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-nova_libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13) Feb 23 02:51:52 localhost podman[51684]: 2026-02-23 07:51:52.361034498 +0000 UTC m=+1.104227897 container attach ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2026-01-12T23:31:49Z, container_name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude 
tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 02:51:52 localhost podman[51642]: 2026-02-23 07:51:52.376616801 +0000 UTC m=+1.173009821 container init 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:34:43Z, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 
'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, container_name=container-puppet-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 02:51:52 localhost podman[51674]: 2026-02-23 07:51:52.388370356 +0000 UTC m=+1.147124008 container start eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-collectd, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 02:51:52 localhost podman[51674]: 2026-02-23 07:51:52.389282374 +0000 UTC m=+1.148036076 container attach eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, 
tcib_managed=true, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, container_name=container-puppet-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:51:52 localhost podman[51642]: 2026-02-23 07:51:52.396963383 +0000 UTC m=+1.193356393 container start 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public) Feb 23 02:51:52 localhost podman[51642]: 2026-02-23 07:51:52.398260423 +0000 UTC m=+1.194653473 container attach 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=container-puppet-iscsid) Feb 23 02:51:52 localhost podman[51673]: 2026-02-23 07:51:52.439071369 +0000 UTC m=+1.209501344 container 
init 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, config_id=tripleo_puppet_step1, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, tcib_managed=true, container_name=container-puppet-crond, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 02:51:52 localhost podman[51673]: 2026-02-23 07:51:52.447313015 +0000 UTC m=+1.217743030 container start 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:51:52 localhost podman[51673]: 2026-02-23 07:51:52.447585224 +0000 UTC m=+1.218015219 container attach 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 
'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=container-puppet-crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 02:51:53 localhost podman[51543]: 2026-02-23 07:51:51.133301009 +0000 UTC m=+0.042195790 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 23 02:51:53 localhost systemd[1]: tmp-crun.Qptawy.mount: Deactivated successfully. Feb 23 02:51:53 localhost podman[51868]: 2026-02-23 07:51:53.396437978 +0000 UTC m=+0.071921893 container create 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:24Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 
'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1) Feb 23 02:51:53 localhost systemd[1]: Started libpod-conmon-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope. Feb 23 02:51:53 localhost systemd[1]: Started libcrun container. 
Feb 23 02:51:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af4f3c6c37d07fdae63d07a90ecbe84180ed2951bbcccf4f59b147e3f8b29057/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:53 localhost podman[51868]: 2026-02-23 07:51:53.351950728 +0000 UTC m=+0.027434613 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 23 02:51:53 localhost podman[51868]: 2026-02-23 07:51:53.467630827 +0000 UTC m=+0.143114702 container init 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:24Z, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2026-01-12T23:07:24Z, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Feb 23 02:51:53 localhost podman[51868]: 2026-02-23 07:51:53.476828992 +0000 UTC m=+0.152312847 container start 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=container-puppet-ceilometer, build-date=2026-01-12T23:07:24Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.5) Feb 23 02:51:53 localhost podman[51868]: 2026-02-23 07:51:53.477926427 +0000 UTC m=+0.153410342 container attach 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T23:07:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=container-puppet-ceilometer, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, url=https://www.redhat.com, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T23:07:24Z, com.redhat.component=openstack-ceilometer-central-container) Feb 23 02:51:54 localhost ovs-vsctl[52067]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Feb 23 02:51:54 localhost puppet-user[51801]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Feb 23 02:51:54 localhost puppet-user[51801]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:54 localhost puppet-user[51801]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:54 localhost puppet-user[51801]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51798]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 02:51:54 localhost puppet-user[51798]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:54 localhost puppet-user[51798]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:54 localhost puppet-user[51798]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51801]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:54 localhost puppet-user[51801]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51798]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:54 localhost puppet-user[51798]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51801]: Notice: Accepting previously invalid value for target type 'Integer' Feb 23 02:51:54 localhost puppet-user[51833]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 02:51:54 localhost puppet-user[51833]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:54 localhost puppet-user[51833]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:54 localhost puppet-user[51833]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51819]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Feb 23 02:51:54 localhost puppet-user[51819]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:54 localhost puppet-user[51819]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:54 localhost puppet-user[51819]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51801]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.11 seconds Feb 23 02:51:54 localhost puppet-user[51806]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 02:51:54 localhost puppet-user[51806]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:54 localhost puppet-user[51806]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51833]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:54 localhost puppet-user[51833]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51819]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:54 localhost puppet-user[51819]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Feb 23 02:51:54 localhost puppet-user[51833]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.07 seconds Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Feb 23 02:51:54 localhost puppet-user[51806]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:54 localhost puppet-user[51806]: (file & line not available) Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}29d074022d6cafdf94866dc1f307d9105f785dc4a34888c55632376b0a0d6303' Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Feb 23 02:51:54 localhost puppet-user[51801]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Feb 23 02:51:54 localhost puppet-user[51801]: Notice: Applied catalog in 0.03 seconds Feb 23 02:51:54 localhost puppet-user[51801]: Application: Feb 23 02:51:54 localhost puppet-user[51801]: Initial environment: production Feb 23 02:51:54 localhost puppet-user[51801]: Converged environment: production Feb 23 02:51:54 localhost puppet-user[51801]: Run mode: user
Feb 23 02:51:54 localhost puppet-user[51801]: Changes: Feb 23 02:51:54 localhost puppet-user[51801]: Total: 7 Feb 23 02:51:54 localhost puppet-user[51801]: Events: Feb 23 02:51:54 localhost puppet-user[51801]: Success: 7 Feb 23 02:51:54 localhost puppet-user[51801]: Total: 7 Feb 23 02:51:54 localhost puppet-user[51801]: Resources: Feb 23 02:51:54 localhost puppet-user[51801]: Skipped: 13 Feb 23 02:51:54 localhost puppet-user[51801]: Changed: 5 Feb 23 02:51:54 localhost puppet-user[51801]: Out of sync: 5 Feb 23 02:51:54 localhost puppet-user[51801]: Total: 20 Feb 23 02:51:54 localhost puppet-user[51801]: Time: Feb 23 02:51:54 localhost puppet-user[51801]: File: 0.02 Feb 23 02:51:54 localhost puppet-user[51801]: Transaction evaluation: 0.03 Feb 23 02:51:54 localhost puppet-user[51801]: Catalog application: 0.03 Feb 23 02:51:54 localhost puppet-user[51801]: Config retrieval: 0.14 Feb 23 02:51:54 localhost puppet-user[51801]: Last run: 1771833114 Feb 23 02:51:54 localhost puppet-user[51801]: Total: 0.03 Feb 23 02:51:54 localhost puppet-user[51801]: Version: Feb 23 02:51:54 localhost puppet-user[51801]: Config: 1771833114 Feb 23 02:51:54 localhost puppet-user[51801]: Puppet: 7.10.0 Feb 23 02:51:54 localhost puppet-user[51819]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.10 seconds Feb 23 02:51:54 localhost puppet-user[51833]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Feb 23 02:51:54 localhost puppet-user[51833]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Feb 23 02:51:54 localhost puppet-user[51833]: Notice: Applied catalog in 0.04 seconds Feb 23 02:51:54 localhost puppet-user[51833]: Application: Feb 23 02:51:54 localhost puppet-user[51833]: Initial environment: production Feb 23 02:51:54 localhost puppet-user[51833]:
Converged environment: production Feb 23 02:51:54 localhost puppet-user[51833]: Run mode: user Feb 23 02:51:54 localhost puppet-user[51833]: Changes: Feb 23 02:51:54 localhost puppet-user[51833]: Total: 2 Feb 23 02:51:54 localhost puppet-user[51833]: Events: Feb 23 02:51:54 localhost puppet-user[51833]: Success: 2 Feb 23 02:51:54 localhost puppet-user[51833]: Total: 2 Feb 23 02:51:54 localhost puppet-user[51833]: Resources: Feb 23 02:51:54 localhost puppet-user[51833]: Changed: 2 Feb 23 02:51:54 localhost puppet-user[51833]: Out of sync: 2 Feb 23 02:51:54 localhost puppet-user[51833]: Skipped: 7 Feb 23 02:51:54 localhost puppet-user[51833]: Total: 9 Feb 23 02:51:54 localhost puppet-user[51833]: Time: Feb 23 02:51:54 localhost puppet-user[51833]: File: 0.01 Feb 23 02:51:54 localhost puppet-user[51833]: Cron: 0.01 Feb 23 02:51:54 localhost puppet-user[51833]: Transaction evaluation: 0.04 Feb 23 02:51:54 localhost puppet-user[51833]: Catalog application: 0.04 Feb 23 02:51:54 localhost puppet-user[51833]: Config retrieval: 0.11 Feb 23 02:51:54 localhost puppet-user[51833]: Last run: 1771833114 Feb 23 02:51:54 localhost puppet-user[51833]: Total: 0.04 Feb 23 02:51:54 localhost puppet-user[51833]: Version: Feb 23 02:51:54 localhost puppet-user[51833]: Config: 1771833114 Feb 23 02:51:54 localhost puppet-user[51833]: Puppet: 7.10.0 Feb 23 02:51:54 localhost puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Feb 23 02:51:54 localhost puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Feb 23 02:51:54 localhost puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Feb 23 02:51:54 localhost puppet-user[51798]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.29 seconds Feb 23 02:51:54 localhost puppet-user[51806]: 
Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Feb 23 02:51:54 localhost puppet-user[51806]: in a future release. Use nova::cinder::os_region_name instead Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Feb 23 02:51:54 localhost puppet-user[51806]: in a future release. Use nova::cinder::catalog_info instead Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: 
/Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: 
removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb' Feb 23 02:51:54 localhost systemd[1]: libpod-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope: Deactivated successfully. Feb 23 02:51:54 localhost systemd[1]: libpod-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope: Consumed 2.104s CPU time. Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af' Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. 
(file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: 
/Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885' Feb 23 02:51:54 localhost systemd[1]: libpod-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope: Deactivated successfully. Feb 23 02:51:54 localhost systemd[1]: libpod-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope: Consumed 2.128s CPU time. Feb 23 02:51:54 localhost podman[51673]: 2026-02-23 07:51:54.7013199 +0000 UTC m=+3.471749885 container died 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, container_name=container-puppet-crond) Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as 
'{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0' Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62' Feb 23 02:51:54 localhost puppet-user[51798]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed Feb 23 02:51:54 localhost puppet-user[51798]: Notice: Applied catalog in 0.21 seconds Feb 23 02:51:54 localhost puppet-user[51798]: Application: Feb 23 02:51:54 localhost puppet-user[51798]: Initial environment: production Feb 23 02:51:54 localhost puppet-user[51798]: Converged environment: production Feb 23 02:51:54 localhost puppet-user[51798]: Run mode: user Feb 23 02:51:54 localhost puppet-user[51798]: Changes: Feb 23 02:51:54 localhost puppet-user[51798]: Total: 43 Feb 23 02:51:54 localhost puppet-user[51798]: Events: Feb 23 02:51:54 localhost puppet-user[51798]: Success: 43 Feb 23 02:51:54 localhost puppet-user[51798]: Total: 43 Feb 23 02:51:54 localhost puppet-user[51798]: Resources: Feb 23 02:51:54 localhost puppet-user[51798]: Skipped: 14 Feb 23 02:51:54 localhost puppet-user[51798]: Changed: 38 Feb 23 02:51:54 localhost puppet-user[51798]: 
Out of sync: 38 Feb 23 02:51:54 localhost puppet-user[51798]: Total: 82 Feb 23 02:51:54 localhost puppet-user[51798]: Time: Feb 23 02:51:54 localhost puppet-user[51798]: File: 0.11 Feb 23 02:51:54 localhost puppet-user[51798]: Transaction evaluation: 0.20 Feb 23 02:51:54 localhost puppet-user[51798]: Catalog application: 0.21 Feb 23 02:51:54 localhost puppet-user[51798]: Config retrieval: 0.39 Feb 23 02:51:54 localhost puppet-user[51798]: Last run: 1771833114 Feb 23 02:51:54 localhost puppet-user[51798]: Concat fragment: 0.00 Feb 23 02:51:54 localhost puppet-user[51798]: Concat file: 0.00 Feb 23 02:51:54 localhost puppet-user[51798]: Total: 0.21 Feb 23 02:51:54 localhost puppet-user[51798]: Version: Feb 23 02:51:54 localhost puppet-user[51798]: Config: 1771833114 Feb 23 02:51:54 localhost puppet-user[51798]: Puppet: 7.10.0 Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Feb 23 02:51:54 localhost podman[52341]: 2026-02-23 07:51:54.759286269 +0000 UTC m=+0.122292306 container died 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, container_name=container-puppet-metrics_qdr, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, 
instead use cpu_dedicated_set or cpu_shared_set. Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance Feb 23 02:51:54 localhost systemd[1]: tmp-crun.kXysIb.mount: Deactivated successfully. Feb 23 02:51:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d-userdata-shm.mount: Deactivated successfully. Feb 23 02:51:54 localhost puppet-user[51819]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully Feb 23 02:51:54 localhost puppet-user[51819]: Notice: Applied catalog in 0.48 seconds Feb 23 02:51:54 localhost puppet-user[51819]: Application: Feb 23 02:51:54 localhost puppet-user[51819]: Initial environment: production Feb 23 02:51:54 localhost puppet-user[51819]: Converged environment: production Feb 23 02:51:54 localhost puppet-user[51819]: Run mode: user Feb 23 02:51:54 localhost puppet-user[51819]: Changes: Feb 23 02:51:54 localhost puppet-user[51819]: Total: 4 Feb 23 02:51:54 localhost puppet-user[51819]: Events: Feb 23 02:51:54 localhost puppet-user[51819]: Success: 4 Feb 23 02:51:54 localhost puppet-user[51819]: Total: 4 Feb 23 02:51:54 localhost puppet-user[51819]: Resources: Feb 23 02:51:54 localhost puppet-user[51819]: Changed: 4 Feb 23 02:51:54 localhost puppet-user[51819]: Out of sync: 4 Feb 23 02:51:54 localhost puppet-user[51819]: Skipped: 8 Feb 23 02:51:54 localhost puppet-user[51819]: Total: 13 Feb 23 02:51:54 localhost puppet-user[51819]: Time: Feb 23 02:51:54 localhost puppet-user[51819]: File: 0.00 Feb 23 02:51:54 localhost puppet-user[51819]: Exec: 0.06 Feb 23 02:51:54 localhost puppet-user[51819]: Config retrieval: 0.13 Feb 23 02:51:54 localhost puppet-user[51819]: Augeas: 0.40 Feb 23 02:51:54 localhost puppet-user[51819]: Transaction evaluation: 0.47 Feb 23 02:51:54 localhost 
puppet-user[51819]: Catalog application: 0.48 Feb 23 02:51:54 localhost puppet-user[51819]: Last run: 1771833114 Feb 23 02:51:54 localhost puppet-user[51819]: Total: 0.48 Feb 23 02:51:54 localhost puppet-user[51819]: Version: Feb 23 02:51:54 localhost puppet-user[51819]: Config: 1771833114 Feb 23 02:51:54 localhost puppet-user[51819]: Puppet: 7.10.0 Feb 23 02:51:54 localhost podman[52341]: 2026-02-23 07:51:54.816079581 +0000 UTC m=+0.179085588 container cleanup 97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13) Feb 23 02:51:54 localhost systemd[1]: libpod-conmon-97c7bbaed7d61c77b044fd13301ae7ea20d1e615cfd076275e14c85aa945e52d.scope: Deactivated successfully. 
Feb 23 02:51:54 localhost podman[52368]: 2026-02-23 07:51:54.827857797 +0000 UTC m=+0.114482453 container cleanup 6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, managed_by=tripleo_ansible, container_name=container-puppet-crond, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat 
OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 02:51:54 localhost systemd[1]: libpod-conmon-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22.scope: Deactivated successfully. Feb 23 02:51:54 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 23 02:51:54 localhost 
python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt 
path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 23 02:51:54 localhost puppet-user[51806]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used. Feb 23 02:51:55 localhost systemd[1]: libpod-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: libpod-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope: Consumed 2.511s CPU time. 
Feb 23 02:51:55 localhost podman[51642]: 2026-02-23 07:51:55.125975898 +0000 UTC m=+3.922368908 container died 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, container_name=container-puppet-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 02:51:55 localhost systemd[1]: libpod-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: libpod-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope: Consumed 2.589s CPU time. 
Feb 23 02:51:55 localhost podman[51674]: 2026-02-23 07:51:55.175483965 +0000 UTC m=+3.934237627 container died eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-collectd, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:51:55 localhost podman[52556]: 2026-02-23 07:51:55.205984241 +0000 UTC m=+0.059808887 container create bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, com.redhat.component=openstack-rsyslog-container, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=container-puppet-rsyslog, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 23 02:51:55 localhost systemd[1]: Started libpod-conmon-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope. Feb 23 02:51:55 localhost systemd[1]: Started libcrun container. 
Feb 23 02:51:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:55 localhost podman[52556]: 2026-02-23 07:51:55.261542725 +0000 UTC m=+0.115367381 container init bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-rsyslog, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog) Feb 23 02:51:55 localhost podman[52543]: 2026-02-23 07:51:55.268185962 +0000 UTC m=+0.134621899 container cleanup 9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=container-puppet-iscsid, 
cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay-912153276d119d62292ee43dc157a09b9029351ec12b42dcebd4c826260b5572-merged.mount: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b008fe2ff44586e63a5b97cc40b6845717ae89a6c7db0ce7b334251da9dbb22-userdata-shm.mount: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay-1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437-merged.mount: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay-6de1ea423bd6d005b3c98cfa65644155837a90a885b901ec0c789a8fa4573360-merged.mount: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4-userdata-shm.mount: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay-bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1-merged.mount: Deactivated successfully. Feb 23 02:51:55 localhost podman[52556]: 2026-02-23 07:51:55.176585269 +0000 UTC m=+0.030409935 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 23 02:51:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc-userdata-shm.mount: Deactivated successfully. Feb 23 02:51:55 localhost systemd[1]: libpod-conmon-9d8e4721b422d8307f077c5cd6534db0034288ebb74472da492d4e73cd4218cc.scope: Deactivated successfully. 
Feb 23 02:51:55 localhost podman[52582]: 2026-02-23 07:51:55.286367235 +0000 UTC m=+0.100681345 container cleanup eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:51:55 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include 
::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 23 02:51:55 
localhost systemd[1]: libpod-conmon-eacb05cb99bd4011e7883f35a49a01d17539879336879b6da96321ac1ca91be4.scope: Deactivated successfully. Feb 23 02:51:55 localhost podman[52556]: 2026-02-23 07:51:55.329929387 +0000 UTC m=+0.183754033 container start bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, config_id=tripleo_puppet_step1, container_name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Feb 23 02:51:55 localhost podman[52556]: 2026-02-23 07:51:55.330239137 +0000 UTC m=+0.184063823 container attach bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.13, vcs-type=git, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-rsyslog, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public) Feb 23 
02:51:55 localhost podman[52613]: 2026-02-23 07:51:55.332542149 +0000 UTC m=+0.073283876 container create 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-ovn_controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, distribution-scope=public) Feb 23 02:51:55 localhost systemd[1]: Started libpod-conmon-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope. Feb 23 02:51:55 localhost systemd[1]: Started libcrun container. 
Feb 23 02:51:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:55 localhost podman[52613]: 2026-02-23 07:51:55.382801108 +0000 UTC m=+0.123542835 container init 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, container_name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 02:51:55 localhost podman[52613]: 2026-02-23 07:51:55.394201722 +0000 UTC m=+0.134943439 container start 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-ovn-controller-container, version=17.1.13, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=container-puppet-ovn_controller) Feb 23 02:51:55 localhost podman[52613]: 2026-02-23 07:51:55.394344686 +0000 UTC m=+0.135086443 container attach 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, container_name=container-puppet-ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public) Feb 23 02:51:55 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid 
--conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 
0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 23 02:51:55 localhost podman[52613]: 2026-02-23 07:51:55.29941463 +0000 UTC m=+0.040156387 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 23 02:51:55 localhost puppet-user[51942]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 02:51:55 localhost puppet-user[51942]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:55 localhost puppet-user[51942]: (file & line not available) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:55 localhost puppet-user[51942]: (file & line not available) Feb 23 02:51:55 localhost puppet-user[51806]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 1.27 seconds Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Feb 23 02:51:55 localhost puppet-user[51942]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Feb 23 02:51:55 localhost puppet-user[51942]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.36 seconds Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}489b6455d50f9ee989125e261ff880fe0ce273a5c46439278b09842d2e1f5116' Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Feb 23 02:51:55 localhost puppet-user[51806]: Warning: Empty environment setting 'TLS_PASSWORD' Feb 23 02:51:55 localhost puppet-user[51806]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}8656b3c96dc5b23eeff252eb63947bbb521645e181af749f7bc85fd2f92d7747' Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: 
/Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Feb 23 02:51:55 localhost puppet-user[51942]: Notice: 
/Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Feb 23 02:51:55 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Feb 23 02:51:56 
localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Feb 23 02:51:56 localhost puppet-user[51942]: Notice: Applied catalog in 0.59 seconds Feb 23 02:51:56 localhost puppet-user[51942]: Application: Feb 23 02:51:56 localhost puppet-user[51942]: Initial environment: production Feb 23 02:51:56 localhost puppet-user[51942]: Converged environment: production Feb 23 02:51:56 localhost puppet-user[51942]: Run mode: user Feb 23 02:51:56 localhost puppet-user[51942]: Changes: Feb 23 02:51:56 localhost puppet-user[51942]: Total: 31 Feb 23 02:51:56 localhost puppet-user[51942]: Events: Feb 23 02:51:56 localhost puppet-user[51942]: Success: 31 Feb 23 02:51:56 localhost puppet-user[51942]: Total: 31 Feb 23 02:51:56 localhost puppet-user[51942]: Resources: Feb 23 02:51:56 localhost puppet-user[51942]: Skipped: 22 Feb 23 02:51:56 localhost puppet-user[51942]: Changed: 31 Feb 23 02:51:56 localhost puppet-user[51942]: Out of sync: 31 Feb 23 02:51:56 localhost puppet-user[51942]: Total: 151 Feb 23 02:51:56 localhost puppet-user[51942]: Time: Feb 23 02:51:56 localhost puppet-user[51942]: Package: 0.02 Feb 
23 02:51:56 localhost puppet-user[51942]: Config retrieval: 0.43 Feb 23 02:51:56 localhost puppet-user[51942]: Ceilometer config: 0.49 Feb 23 02:51:56 localhost puppet-user[51942]: Transaction evaluation: 0.58 Feb 23 02:51:56 localhost puppet-user[51942]: Catalog application: 0.59 Feb 23 02:51:56 localhost puppet-user[51942]: Last run: 1771833116 Feb 23 02:51:56 localhost puppet-user[51942]: Resources: 0.00 Feb 23 02:51:56 localhost puppet-user[51942]: Total: 0.59 Feb 23 02:51:56 localhost puppet-user[51942]: Version: Feb 23 02:51:56 localhost puppet-user[51942]: Config: 1771833115 Feb 23 02:51:56 localhost puppet-user[51942]: Puppet: 7.10.0 Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Feb 23 02:51:56 localhost 
puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: 
/Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Feb 23 02:51:56 localhost systemd[1]: libpod-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope: Deactivated successfully. Feb 23 02:51:56 localhost systemd[1]: libpod-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope: Consumed 3.036s CPU time. 
Feb 23 02:51:56 localhost podman[51868]: 2026-02-23 07:51:56.869771881 +0000 UTC m=+3.545255786 container died 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:24Z, vcs-type=git, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:24Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container) Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Feb 23 02:51:56 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Feb 23 02:51:56 localhost systemd[1]: tmp-crun.TJGeAN.mount: Deactivated successfully. Feb 23 02:51:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7-userdata-shm.mount: Deactivated successfully. 
Feb 23 02:51:57 localhost podman[52837]: 2026-02-23 07:51:57.027589329 +0000 UTC m=+0.143620189 container cleanup 5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-central, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-central, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ceilometer, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, org.opencontainers.image.created=2026-01-12T23:07:24Z, build-date=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, release=1766032510, maintainer=OpenStack TripleO Team) Feb 23 02:51:57 localhost systemd[1]: libpod-conmon-5d2e10e62642c4fc5eee57693129d6f6e40295ab192929a8d8f0657a22ca4cf7.scope: Deactivated successfully. 
Feb 23 02:51:57 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Feb 23 02:51:57 
localhost puppet-user[52737]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 02:51:57 localhost puppet-user[52737]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:57 localhost puppet-user[52737]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:57 localhost puppet-user[52737]: (file & line not available) Feb 23 02:51:57 localhost puppet-user[52644]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 02:51:57 localhost puppet-user[52644]: (file: /etc/puppet/hiera.yaml) Feb 23 02:51:57 localhost puppet-user[52644]: Warning: Undefined variable '::deploy_config_name'; Feb 23 02:51:57 localhost puppet-user[52644]: (file & line not available) Feb 23 02:51:57 localhost puppet-user[52737]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:57 localhost puppet-user[52737]: (file & line not available) Feb 23 02:51:57 localhost puppet-user[52644]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 02:51:57 localhost puppet-user[52644]: (file & line not available) Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Feb 23 02:51:57 localhost systemd[1]: var-lib-containers-storage-overlay-af4f3c6c37d07fdae63d07a90ecbe84180ed2951bbcccf4f59b147e3f8b29057-merged.mount: Deactivated successfully. 
Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Feb 23 02:51:57 localhost puppet-user[52644]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.21 seconds Feb 23 02:51:57 localhost puppet-user[52737]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.21 seconds Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53007]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Feb 23 02:51:57 localhost puppet-user[52644]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Feb 23 02:51:57 localhost puppet-user[52644]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Feb 23 02:51:57 localhost puppet-user[52644]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}5c90e46cff2762d304afc2b09980a80c0046036381fac743e7540f8aa8df54d3' Feb 23 02:51:57 localhost puppet-user[52644]: Notice: Applied catalog in 0.10 seconds Feb 23 02:51:57 localhost puppet-user[52644]: Application: Feb 23 02:51:57 localhost puppet-user[52644]: Initial environment: production Feb 23 02:51:57 localhost puppet-user[52644]: Converged environment: production Feb 23 02:51:57 localhost puppet-user[52644]: Run mode: user Feb 23 02:51:57 localhost 
puppet-user[52644]: Changes: Feb 23 02:51:57 localhost puppet-user[52644]: Total: 3 Feb 23 02:51:57 localhost puppet-user[52644]: Events: Feb 23 02:51:57 localhost puppet-user[52644]: Success: 3 Feb 23 02:51:57 localhost puppet-user[52644]: Total: 3 Feb 23 02:51:57 localhost puppet-user[52644]: Resources: Feb 23 02:51:57 localhost puppet-user[52644]: Skipped: 11 Feb 23 02:51:57 localhost puppet-user[52644]: Changed: 3 Feb 23 02:51:57 localhost puppet-user[52644]: Out of sync: 3 Feb 23 02:51:57 localhost puppet-user[52644]: Total: 25 Feb 23 02:51:57 localhost puppet-user[52644]: Time: Feb 23 02:51:57 localhost puppet-user[52644]: Concat file: 0.00 Feb 23 02:51:57 localhost puppet-user[52644]: Concat fragment: 0.00 Feb 23 02:51:57 localhost puppet-user[52644]: File: 0.01 Feb 23 02:51:57 localhost puppet-user[52644]: Transaction evaluation: 0.09 Feb 23 02:51:57 localhost puppet-user[52644]: Catalog application: 0.10 Feb 23 02:51:57 localhost puppet-user[52644]: Config retrieval: 0.25 Feb 23 02:51:57 localhost puppet-user[52644]: Last run: 1771833117 Feb 23 02:51:57 localhost puppet-user[52644]: Total: 0.10 Feb 23 02:51:57 localhost puppet-user[52644]: Version: Feb 23 02:51:57 localhost puppet-user[52644]: Config: 1771833117 Feb 23 02:51:57 localhost puppet-user[52644]: Puppet: 7.10.0 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53009]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53011]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-ip=172.19.0.106 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53019]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005626463.localdomain Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005626463.novalocal' to 'np0005626463.localdomain' Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53022]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-bridge=br-int Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53026]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53029]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60 Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53031]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-monitor-all=true Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53047]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000 Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}9109162380f9c461e3d8ec780edb8a48cdd59dabd84e70a5fe7d1088fe416c1b' Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53049]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-encap-tos=0 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53056]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:96:08:8c Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created Feb 23 02:51:57 localhost systemd[1]: libpod-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope: Deactivated successfully. Feb 23 02:51:57 localhost systemd[1]: libpod-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope: Consumed 2.415s CPU time. 
Feb 23 02:51:57 localhost podman[52556]: 2026-02-23 07:51:57.822500915 +0000 UTC m=+2.676325631 container died bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-rsyslog) Feb 23 02:51:57 localhost ovs-vsctl[53064]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53072]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . 
external_ids:ovn-match-northd-version=false Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created Feb 23 02:51:57 localhost ovs-vsctl[53078]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0 Feb 23 02:51:57 localhost puppet-user[52737]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created Feb 23 02:51:57 localhost puppet-user[52737]: Notice: Applied catalog in 0.51 seconds Feb 23 02:51:57 localhost puppet-user[52737]: Application: Feb 23 02:51:57 localhost puppet-user[52737]: Initial environment: production Feb 23 02:51:57 localhost puppet-user[52737]: Converged environment: production Feb 23 02:51:57 localhost puppet-user[52737]: Run mode: user Feb 23 02:51:57 localhost puppet-user[52737]: Changes: Feb 23 02:51:57 localhost puppet-user[52737]: Total: 14 Feb 23 02:51:57 localhost puppet-user[52737]: Events: Feb 23 02:51:57 localhost puppet-user[52737]: Success: 14 Feb 23 02:51:57 localhost puppet-user[52737]: Total: 14 Feb 23 02:51:57 localhost 
puppet-user[52737]: Resources: Feb 23 02:51:57 localhost puppet-user[52737]: Skipped: 12 Feb 23 02:51:57 localhost puppet-user[52737]: Changed: 14 Feb 23 02:51:57 localhost puppet-user[52737]: Out of sync: 14 Feb 23 02:51:57 localhost puppet-user[52737]: Total: 29 Feb 23 02:51:57 localhost puppet-user[52737]: Time: Feb 23 02:51:57 localhost puppet-user[52737]: Exec: 0.02 Feb 23 02:51:57 localhost puppet-user[52737]: Config retrieval: 0.27 Feb 23 02:51:57 localhost puppet-user[52737]: Vs config: 0.43 Feb 23 02:51:57 localhost puppet-user[52737]: Transaction evaluation: 0.50 Feb 23 02:51:57 localhost puppet-user[52737]: Catalog application: 0.51 Feb 23 02:51:57 localhost puppet-user[52737]: Last run: 1771833117 Feb 23 02:51:57 localhost puppet-user[52737]: Total: 0.51 Feb 23 02:51:57 localhost puppet-user[52737]: Version: Feb 23 02:51:57 localhost puppet-user[52737]: Config: 1771833117 Feb 23 02:51:57 localhost puppet-user[52737]: Puppet: 7.10.0 Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created Feb 23 02:51:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5-userdata-shm.mount: Deactivated successfully. Feb 23 02:51:57 localhost systemd[1]: var-lib-containers-storage-overlay-df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521-merged.mount: Deactivated successfully. 
Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created Feb 23 02:51:57 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created Feb 23 02:51:58 localhost podman[53065]: 2026-02-23 07:51:58.018105975 +0000 UTC m=+0.181784502 container cleanup bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=container-puppet-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog) Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: 
/Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: 
/Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created Feb 23 02:51:58 localhost systemd[1]: libpod-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope: Deactivated successfully. Feb 23 02:51:58 localhost systemd[1]: libpod-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope: Consumed 2.834s CPU time. Feb 23 02:51:58 localhost podman[52613]: 2026-02-23 07:51:58.47199408 +0000 UTC m=+3.212735817 container died 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, config_id=tripleo_puppet_step1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, container_name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 02:51:58 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Feb 23 02:51:58 localhost 
systemd[1]: libpod-conmon-bd0a74a8cd004f505267dc81a174257d2b7a291386b2f29bde186b94923c26a5.scope: Deactivated successfully. Feb 23 02:51:58 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 23 02:51:58 localhost systemd[1]: tmp-crun.vSCaLd.mount: Deactivated successfully. Feb 23 02:51:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95-userdata-shm.mount: Deactivated successfully. Feb 23 02:51:58 localhost systemd[1]: var-lib-containers-storage-overlay-4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b-merged.mount: Deactivated successfully. 
Feb 23 02:51:59 localhost podman[53130]: 2026-02-23 07:51:59.142393025 +0000 UTC m=+0.663382128 container cleanup 99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Feb 23 02:51:59 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 23 02:51:59 localhost systemd[1]: libpod-conmon-99554d424a6d099346c4a7ba038c3fd113ee65313c60f1a445ba39a36fbd8b95.scope: Deactivated successfully. 
Feb 23 02:51:59 localhost podman[52775]: 2026-02-23 07:51:55.673672084 +0000 UTC m=+0.032370055 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Feb 23 02:51:59 localhost podman[53257]: 2026-02-23 07:51:59.442343683 +0000 UTC m=+0.082704008 container create 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, name=rhosp-rhel9/openstack-neutron-server, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20260112.1, config_data={'security_opt': 
['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, build-date=2026-01-12T22:57:35Z, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 02:51:59 localhost systemd[1]: Started libpod-conmon-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope. Feb 23 02:51:59 localhost podman[53257]: 2026-02-23 07:51:59.395487538 +0000 UTC m=+0.035847893 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Feb 23 02:51:59 localhost systemd[1]: Started libcrun container. Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Feb 23 02:51:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40ec17e1ee8e7a0751fc146049cedcb53f05d4808bfead6a438b854c73d49686/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Feb 23 02:51:59 localhost podman[53257]: 2026-02-23 07:51:59.512000794 +0000 UTC m=+0.152361119 container init 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, version=17.1.13, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-neutron-server, container_name=container-puppet-neutron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:57:35Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 
'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:57:35Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container) Feb 23 02:51:59 localhost podman[53257]: 2026-02-23 07:51:59.520404955 +0000 UTC 
m=+0.160765280 container start 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, config_id=tripleo_puppet_step1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-server-container, build-date=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp-rhel9/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=container-puppet-neutron, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.created=2026-01-12T22:57:35Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 02:51:59 localhost podman[53257]: 2026-02-23 07:51:59.520695414 +0000 UTC m=+0.161055779 container attach 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, tcib_managed=true, container_name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 
17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, build-date=2026-01-12T22:57:35Z, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, config_id=tripleo_puppet_step1, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Feb 23 02:51:59 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: 
/Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created Feb 23 02:52:00 localhost puppet-user[51806]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4' Feb 23 02:52:00 localhost puppet-user[51806]: Notice: Applied catalog in 4.54 seconds Feb 23 02:52:00 localhost puppet-user[51806]: Application: Feb 23 02:52:00 localhost puppet-user[51806]: Initial environment: production Feb 23 02:52:00 localhost puppet-user[51806]: Converged environment: production Feb 23 02:52:00 localhost puppet-user[51806]: Run mode: user Feb 23 02:52:00 localhost puppet-user[51806]: Changes: Feb 23 02:52:00 localhost puppet-user[51806]: Total: 183 Feb 23 02:52:00 localhost puppet-user[51806]: Events: Feb 23 02:52:00 localhost puppet-user[51806]: Success: 183 Feb 23 02:52:00 
localhost puppet-user[51806]: Total: 183 Feb 23 02:52:00 localhost puppet-user[51806]: Resources: Feb 23 02:52:00 localhost puppet-user[51806]: Changed: 183 Feb 23 02:52:00 localhost puppet-user[51806]: Out of sync: 183 Feb 23 02:52:00 localhost puppet-user[51806]: Skipped: 57 Feb 23 02:52:00 localhost puppet-user[51806]: Total: 487 Feb 23 02:52:00 localhost puppet-user[51806]: Time: Feb 23 02:52:00 localhost puppet-user[51806]: Concat fragment: 0.00 Feb 23 02:52:00 localhost puppet-user[51806]: Anchor: 0.00 Feb 23 02:52:00 localhost puppet-user[51806]: File line: 0.00 Feb 23 02:52:00 localhost puppet-user[51806]: Virtlogd config: 0.00 Feb 23 02:52:00 localhost puppet-user[51806]: Virtsecretd config: 0.01 Feb 23 02:52:00 localhost puppet-user[51806]: Virtstoraged config: 0.02 Feb 23 02:52:00 localhost puppet-user[51806]: Exec: 0.02 Feb 23 02:52:00 localhost puppet-user[51806]: File: 0.02 Feb 23 02:52:00 localhost puppet-user[51806]: Virtproxyd config: 0.03 Feb 23 02:52:00 localhost puppet-user[51806]: Package: 0.03 Feb 23 02:52:00 localhost puppet-user[51806]: Virtqemud config: 0.03 Feb 23 02:52:00 localhost puppet-user[51806]: Virtnodedevd config: 0.04 Feb 23 02:52:00 localhost puppet-user[51806]: Augeas: 0.95 Feb 23 02:52:00 localhost puppet-user[51806]: Config retrieval: 1.50 Feb 23 02:52:00 localhost puppet-user[51806]: Last run: 1771833120 Feb 23 02:52:00 localhost puppet-user[51806]: Nova config: 3.16 Feb 23 02:52:00 localhost puppet-user[51806]: Transaction evaluation: 4.50 Feb 23 02:52:00 localhost puppet-user[51806]: Catalog application: 4.54 Feb 23 02:52:00 localhost puppet-user[51806]: Resources: 0.00 Feb 23 02:52:00 localhost puppet-user[51806]: Concat file: 0.00 Feb 23 02:52:00 localhost puppet-user[51806]: Total: 4.55 Feb 23 02:52:00 localhost puppet-user[51806]: Version: Feb 23 02:52:00 localhost puppet-user[51806]: Config: 1771833114 Feb 23 02:52:00 localhost puppet-user[51806]: Puppet: 7.10.0 Feb 23 02:52:01 localhost systemd[1]: 
libpod-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope: Deactivated successfully. Feb 23 02:52:01 localhost systemd[1]: libpod-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope: Consumed 8.410s CPU time. Feb 23 02:52:01 localhost podman[53334]: 2026-02-23 07:52:01.195068862 +0000 UTC m=+0.034997536 container died ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do 
with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, container_name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 02:52:01 localhost puppet-user[53288]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Feb 23 02:52:01 localhost systemd[1]: 
tmp-crun.BztZin.mount: Deactivated successfully. Feb 23 02:52:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757-userdata-shm.mount: Deactivated successfully. Feb 23 02:52:01 localhost systemd[1]: var-lib-containers-storage-overlay-f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e-merged.mount: Deactivated successfully. Feb 23 02:52:01 localhost podman[53334]: 2026-02-23 07:52:01.317274055 +0000 UTC m=+0.157202739 container cleanup ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, vcs-type=git, container_name=container-puppet-nova_libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 
'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, batch=17.1_20260112.1) Feb 23 02:52:01 
localhost systemd[1]: libpod-conmon-ae6d349aadf9563baef5ab53f4404eb45b0b3df6e2f63b0bac1cbbfd883bf757.scope: Deactivated successfully. Feb 23 02:52:01 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 02:52:01 localhost puppet-user[53288]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 02:52:01 localhost puppet-user[53288]: (file: /etc/puppet/hiera.yaml)
Feb 23 02:52:01 localhost puppet-user[53288]: Warning: Undefined variable '::deploy_config_name';
Feb 23 02:52:01 localhost puppet-user[53288]: (file & line not available)
Feb 23 02:52:01 localhost puppet-user[53288]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 02:52:01 localhost puppet-user[53288]: (file & line not available)
Feb 23 02:52:01 localhost puppet-user[53288]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.58 seconds
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Feb 23 02:52:02 localhost puppet-user[53288]: Notice: Applied catalog in 0.41 seconds
Feb 23 02:52:02 localhost puppet-user[53288]: Application:
Feb 23 02:52:02 localhost puppet-user[53288]: Initial environment: production
Feb 23 02:52:02 localhost puppet-user[53288]: Converged environment: production
Feb 23 02:52:02 localhost puppet-user[53288]: Run mode: user
Feb 23 02:52:02 localhost puppet-user[53288]: Changes:
Feb 23 02:52:02 localhost puppet-user[53288]: Total: 33
Feb 23 02:52:02 localhost puppet-user[53288]: Events:
Feb 23 02:52:02 localhost puppet-user[53288]: Success: 33
Feb 23 02:52:02 localhost puppet-user[53288]: Total: 33
Feb 23 02:52:02 localhost puppet-user[53288]: Resources:
Feb 23 02:52:02 localhost puppet-user[53288]: Skipped: 21
Feb 23 02:52:02 localhost puppet-user[53288]: Changed: 33
Feb 23 02:52:02 localhost puppet-user[53288]: Out of sync: 33
Feb 23 02:52:02 localhost puppet-user[53288]: Total: 155
Feb 23 02:52:02 localhost puppet-user[53288]: Time:
Feb 23 02:52:02 localhost puppet-user[53288]: Resources: 0.00
Feb 23 02:52:02 localhost puppet-user[53288]: Ovn metadata agent config: 0.02
Feb 23 02:52:02 localhost puppet-user[53288]: Neutron config: 0.32
Feb 23 02:52:02 localhost puppet-user[53288]: Transaction evaluation: 0.40
Feb 23 02:52:02 localhost puppet-user[53288]: Catalog application: 0.41
Feb 23 02:52:02 localhost puppet-user[53288]: Config retrieval: 0.65
Feb 23 02:52:02 localhost puppet-user[53288]: Last run: 1771833122
Feb 23 02:52:02 localhost puppet-user[53288]: Total: 0.41
Feb 23 02:52:02 localhost puppet-user[53288]: Version:
Feb 23 02:52:02 localhost puppet-user[53288]: Config: 1771833121
Feb 23 02:52:02 localhost puppet-user[53288]: Puppet: 7.10.0
Feb 23 02:52:02 localhost systemd[1]: libpod-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope: Deactivated successfully.
Feb 23 02:52:02 localhost systemd[1]: libpod-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope: Consumed 3.441s CPU time.
Feb 23 02:52:03 localhost podman[53257]: 2026-02-23 07:52:03.00313662 +0000 UTC m=+3.643496975 container died 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, summary=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, container_name=container-puppet-neutron, io.openshift.expose-services=, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-neutron-server, build-date=2026-01-12T22:57:35Z, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 23 02:52:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59-userdata-shm.mount: Deactivated successfully.
Feb 23 02:52:03 localhost systemd[1]: var-lib-containers-storage-overlay-40ec17e1ee8e7a0751fc146049cedcb53f05d4808bfead6a438b854c73d49686-merged.mount: Deactivated successfully.
Feb 23 02:52:03 localhost podman[53471]: 2026-02-23 07:52:03.132759402 +0000 UTC m=+0.116286309 container cleanup 40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, com.redhat.component=openstack-neutron-server-container, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=container-puppet-neutron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:57:35Z, description=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:57:35Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, name=rhosp-rhel9/openstack-neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, version=17.1.13, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 02:52:03 localhost systemd[1]: libpod-conmon-40aa3a549aedee591e3aede6855bf29a6727a6d1aa5622c36500db7d39298f59.scope: Deactivated successfully.
Feb 23 02:52:03 localhost python3[51466]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005626463 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005626463', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 23 02:52:04 localhost python3[53525]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:05 localhost python3[53557]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:52:05 localhost python3[53607]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:52:06 localhost python3[53650]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833125.2457435-84994-250329414396068/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:06 localhost python3[53712]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:52:06 localhost python3[53755]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833126.192573-84994-228560224324970/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:07 localhost python3[53817]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:52:07 localhost python3[53860]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833127.080182-85042-154961231096837/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:08 localhost python3[53922]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:52:08 localhost python3[53965]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833127.9265833-85065-71420798557053/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:09 localhost python3[53995]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:52:09 localhost systemd[1]: Reloading.
Feb 23 02:52:09 localhost systemd-sysv-generator[54020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:52:09 localhost systemd-rc-local-generator[54014]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:52:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:52:09 localhost systemd[1]: Reloading.
Feb 23 02:52:09 localhost systemd-rc-local-generator[54057]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:52:09 localhost systemd-sysv-generator[54062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:52:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:52:09 localhost systemd[1]: Starting TripleO Container Shutdown...
Feb 23 02:52:09 localhost systemd[1]: Finished TripleO Container Shutdown.
Feb 23 02:52:10 localhost python3[54117]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:52:10 localhost python3[54160]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833129.831725-85146-209533956272378/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:11 localhost python3[54222]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:52:11 localhost python3[54265]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833130.6995149-85160-67469161869531/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:52:11 localhost python3[54295]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 02:52:11 localhost systemd[1]: Reloading.
Feb 23 02:52:12 localhost systemd-rc-local-generator[54319]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:52:12 localhost systemd-sysv-generator[54322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:52:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:52:12 localhost systemd[1]: Reloading.
Feb 23 02:52:12 localhost systemd-rc-local-generator[54358]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 02:52:12 localhost systemd-sysv-generator[54361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 02:52:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 02:52:12 localhost systemd[1]: Starting Create netns directory...
Feb 23 02:52:12 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 02:52:12 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 02:52:12 localhost systemd[1]: Finished Create netns directory.
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 90a8871bd317528138d212bd0375f6aa
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 45772c82d00b8348e0440509154d74a9
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 8e5028e38f7077561ef1e3e50ec174a3
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 44281c742f88411d75916a4e58499720
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 44281c742f88411d75916a4e58499720
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: cf62475d9880911ecf982eff6ab572ad
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:12 localhost python3[54388]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: b5f04eda8e5f004a5ff6ec948b25cc1e
Feb 23 02:52:14 localhost python3[54446]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 02:52:14 localhost podman[54483]: 2026-02-23 07:52:14.876395215 +0000 UTC m=+0.088663072 container create 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=)
Feb 23 02:52:14 localhost systemd[1]: Started libpod-conmon-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f.scope.
Feb 23 02:52:14 localhost podman[54483]: 2026-02-23 07:52:14.834874307 +0000 UTC m=+0.047142194 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 23 02:52:14 localhost systemd[1]: Started libcrun container.
Feb 23 02:52:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb85f8b1f9f1bb7644ed891399fb297bad9a6f983f4d7e10e6f8474d89d107e3/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 23 02:52:14 localhost podman[54483]: 2026-02-23 07:52:14.952360992 +0000 UTC m=+0.164628849 container init 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, build-date=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 23 02:52:14 localhost podman[54483]: 2026-02-23 07:52:14.962972112 +0000 UTC m=+0.175239969 container start 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr_init_logs, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd)
Feb 23 02:52:14 localhost podman[54483]: 2026-02-23 07:52:14.963273431 +0000 UTC m=+0.175541338 container attach 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr_init_logs, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=,
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 02:52:14 localhost systemd[1]: libpod-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f.scope: Deactivated successfully. Feb 23 02:52:14 localhost podman[54483]: 2026-02-23 07:52:14.970736973 +0000 UTC m=+0.183004810 container died 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, 
konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr_init_logs, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 02:52:15 localhost podman[54502]: 2026-02-23 07:52:15.061609063 +0000 UTC m=+0.077287049 container cleanup 1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr_init_logs, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': 
['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc.) Feb 23 02:52:15 localhost systemd[1]: libpod-conmon-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f.scope: Deactivated successfully. Feb 23 02:52:15 localhost python3[54446]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Feb 23 02:52:15 localhost podman[54579]: 2026-02-23 07:52:15.55293999 +0000 UTC m=+0.080141888 container create f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, managed_by=tripleo_ansible, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true) Feb 23 02:52:15 localhost systemd[1]: Started libpod-conmon-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope. Feb 23 02:52:15 localhost systemd[1]: Started libcrun container. Feb 23 02:52:15 localhost podman[54579]: 2026-02-23 07:52:15.512143544 +0000 UTC m=+0.039345512 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 23 02:52:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 23 02:52:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Feb 23 02:52:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 02:52:15 localhost podman[54579]: 2026-02-23 07:52:15.640983412 +0000 UTC m=+0.168185310 container init f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, build-date=2026-01-12T22:10:14Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, batch=17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible) Feb 23 02:52:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:52:15 localhost podman[54579]: 2026-02-23 07:52:15.675551294 +0000 UTC m=+0.202753202 container start f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, release=1766032510, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 02:52:15 localhost python3[54446]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=90a8871bd317528138d212bd0375f6aa --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z 
registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Feb 23 02:52:15 localhost podman[54601]: 2026-02-23 07:52:15.765760694 +0000 UTC m=+0.081054536 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, managed_by=tripleo_ansible) Feb 23 02:52:15 localhost systemd[1]: var-lib-containers-storage-overlay-eb85f8b1f9f1bb7644ed891399fb297bad9a6f983f4d7e10e6f8474d89d107e3-merged.mount: Deactivated successfully. Feb 23 02:52:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1521039cbc1f0b454f347c398b2f9d9f82e9d5fd05702e1b7ff5d0597e25598f-userdata-shm.mount: Deactivated successfully. 
Feb 23 02:52:15 localhost podman[54601]: 2026-02-23 07:52:15.995198613 +0000 UTC m=+0.310492465 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510) Feb 23 02:52:16 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:52:16 localhost python3[54676]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:52:16 localhost python3[54692]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:52:17 localhost python3[54753]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833136.6384125-85348-131987307837601/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None 
Feb 23 02:52:17 localhost python3[54769]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 02:52:17 localhost systemd[1]: Reloading. Feb 23 02:52:17 localhost systemd-rc-local-generator[54791]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:52:17 localhost systemd-sysv-generator[54797]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:52:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:52:18 localhost sshd[54806]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:52:18 localhost python3[54823]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:52:18 localhost systemd[1]: Reloading. Feb 23 02:52:18 localhost systemd-sysv-generator[54851]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:52:18 localhost systemd-rc-local-generator[54847]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:52:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:52:18 localhost systemd[1]: Starting metrics_qdr container... Feb 23 02:52:18 localhost systemd[1]: Started metrics_qdr container. 
Feb 23 02:52:19 localhost python3[54902]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:52:20 localhost python3[55023]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005626463 step=1 update_config_hash_only=False Feb 23 02:52:21 localhost python3[55039]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:52:21 localhost python3[55055]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 23 02:52:40 localhost sshd[55056]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:52:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:52:46 localhost systemd[1]: tmp-crun.iVbmdT.mount: Deactivated successfully. 
Feb 23 02:52:46 localhost podman[55058]: 2026-02-23 07:52:46.930019352 +0000 UTC m=+0.098481529 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, version=17.1.13, batch=17.1_20260112.1, container_name=metrics_qdr) Feb 23 02:52:47 localhost podman[55058]: 2026-02-23 07:52:47.165247212 +0000 UTC m=+0.333709419 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:52:47 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:52:58 localhost sshd[55165]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:53:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 02:53:17 localhost podman[55167]: 2026-02-23 07:53:17.920229013 +0000 UTC m=+0.091409036 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 02:53:18 localhost podman[55167]: 2026-02-23 07:53:18.114964232 +0000 UTC m=+0.286144295 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Feb 23 02:53:18 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:53:39 localhost sshd[55196]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:53:40 localhost sshd[55198]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:53:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:53:48 localhost systemd[1]: tmp-crun.5zsS08.mount: Deactivated successfully. 
Feb 23 02:53:48 localhost podman[55200]: 2026-02-23 07:53:48.910436471 +0000 UTC m=+0.086378874 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Feb 23 02:53:49 localhost podman[55200]: 2026-02-23 07:53:49.125115048 +0000 UTC m=+0.301057411 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com) Feb 23 02:53:49 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:54:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:54:19 localhost systemd[1]: tmp-crun.kR7Ivp.mount: Deactivated successfully. 
Feb 23 02:54:19 localhost podman[55307]: 2026-02-23 07:54:19.91578942 +0000 UTC m=+0.091147202 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, 
managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, com.redhat.component=openstack-qdrouterd-container) Feb 23 02:54:20 localhost podman[55307]: 2026-02-23 07:54:20.123338095 +0000 UTC m=+0.298695887 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Feb 23 02:54:20 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:54:20 localhost sshd[55336]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:54:36 localhost sshd[55338]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:54:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:54:50 localhost systemd[1]: tmp-crun.CwkCtg.mount: Deactivated successfully. 
Feb 23 02:54:50 localhost podman[55340]: 2026-02-23 07:54:50.908167573 +0000 UTC m=+0.082886835 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Feb 23 02:54:51 localhost podman[55340]: 2026-02-23 07:54:51.122596946 +0000 UTC m=+0.297316188 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1) Feb 23 02:54:51 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:55:02 localhost sshd[55445]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:55:19 localhost sshd[55447]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 02:55:21 localhost podman[55449]: 2026-02-23 07:55:21.364617151 +0000 UTC m=+0.074052225 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, 
batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, vendor=Red Hat, Inc.) Feb 23 02:55:21 localhost podman[55449]: 2026-02-23 07:55:21.588482762 +0000 UTC m=+0.297917796 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, distribution-scope=public) Feb 23 02:55:21 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:55:31 localhost sshd[55478]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:55:44 localhost sshd[55480]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:55:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:55:51 localhost systemd[1]: tmp-crun.QrwgpO.mount: Deactivated successfully. 
Feb 23 02:55:51 localhost podman[55482]: 2026-02-23 07:55:51.908510504 +0000 UTC m=+0.085814295 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1) Feb 23 02:55:52 localhost podman[55482]: 2026-02-23 07:55:52.105337781 +0000 UTC m=+0.282641612 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 02:55:52 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:56:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:56:22 localhost systemd[1]: tmp-crun.w3okcp.mount: Deactivated successfully. 
Feb 23 02:56:22 localhost podman[55589]: 2026-02-23 07:56:22.913600585 +0000 UTC m=+0.084202375 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 02:56:23 localhost podman[55589]: 2026-02-23 07:56:23.145064013 +0000 UTC m=+0.315665763 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 02:56:23 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:56:25 localhost sshd[55618]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:56:27 localhost sshd[55620]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:56:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 02:56:53 localhost podman[55622]: 2026-02-23 07:56:53.902347499 +0000 UTC m=+0.080881762 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 23 02:56:54 localhost podman[55622]: 2026-02-23 07:56:54.089437915 +0000 UTC m=+0.267972138 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1) Feb 23 02:56:54 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 02:57:06 localhost sshd[55729]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:57:22 localhost ceph-osd[32575]: osd.5 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [3,5,1] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:23 localhost ceph-osd[32575]: osd.5 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,3,1] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:24 localhost ceph-osd[32575]: osd.5 pg_epoch: 23 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [5,3,1] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:24 localhost sshd[55731]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:57:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:57:24 localhost podman[55732]: 2026-02-23 07:57:24.906289379 +0000 UTC m=+0.079972914 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, version=17.1.13, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public) Feb 23 02:57:25 localhost podman[55732]: 2026-02-23 07:57:25.124535559 +0000 UTC m=+0.298219134 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, release=1766032510, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public) Feb 23 02:57:25 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:57:26 localhost ceph-osd[32575]: osd.5 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [4,0,5] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:27 localhost ceph-osd[31633]: osd.2 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,4,3] r=0 lpr=26 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 27 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=27 pruub=12.108013153s) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active pruub 1113.961425781s@ mbc={}] start_peering_interval up [5,3,1] -> [5,3,1], acting [5,3,1] -> [5,3,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 27 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=27 pruub=12.108013153s) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown pruub 1113.961425781s@ mbc={}] state: transitioning to Primary Feb 23 02:57:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 27 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=27 pruub=9.998037338s) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 active pruub 1111.856933594s@ mbc={}] start_peering_interval up [3,5,1] 
-> [3,5,1], acting [3,5,1] -> [3,5,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 27 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=27 pruub=9.994277954s) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1111.856933594s@ mbc={}] state: transitioning to Stray Feb 23 02:57:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 27 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [2,4,3] r=0 lpr=26 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.19( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.19( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.18( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.16( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.17( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.16( 
empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.17( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.18( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.14( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.15( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.15( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.12( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.13( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.12( empty local-lis/les=20/21 
n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.11( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.10( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.10( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.11( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.f( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.e( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.13( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.f( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 
les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.e( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.14( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.d( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.c( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.a( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.c( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.d( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.a( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] 
r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.b( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.3( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.2( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.b( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.2( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.7( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 
crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.6( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.3( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.4( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.5( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.4( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.5( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.8( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.6( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] 
state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.7( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.8( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.9( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.9( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1b( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1a( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1a( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1b( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray 
Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1d( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1c( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1c( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1d( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1f( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1e( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[2.1f( empty local-lis/les=20/21 n=0 ec=27/20 lis/c=20/20 les/c/f=21/21/0 sis=27) [3,5,1] r=1 lpr=27 pi=[20,27)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1e( empty local-lis/les=22/23 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:29 localhost 
ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.0( empty local-lis/les=27/28 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.19( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.14( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.17( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.18( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.15( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.13( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.12( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 
pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.10( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.11( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 
pg_epoch: 28 pg[3.b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.2( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.3( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.6( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.16( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.5( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.7( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 
active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.4( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.9( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.8( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1d( empty 
local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 28 pg[3.1e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=22/22 les/c/f=23/23/0 sis=27) [5,3,1] r=0 lpr=27 pi=[22,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:32 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.0 scrub starts Feb 23 02:57:32 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.0 scrub ok Feb 23 02:57:33 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.19 scrub starts Feb 23 02:57:33 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.19 scrub ok Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.19( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933093071s) [4,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.913330078s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,2], acting [3,5,1] -> [4,0,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.19( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933013916s) [4,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.913330078s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.18( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924542427s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905029297s@ mbc={}] start_peering_interval up [5,3,1] -> [4,2,3], acting [5,3,1] -> [4,2,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.18( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924520493s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905029297s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.16( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935929298s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916870117s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.16( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.927214622s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.908081055s@ mbc={}] start_peering_interval up [5,3,1] -> [3,1,5], acting [5,3,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.16( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935905457s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.916870117s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.19( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923992157s) [1,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.904907227s@ mbc={}] start_peering_interval up [5,3,1] -> [1,0,2], acting [5,3,1] -> [1,0,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 
pg_epoch: 31 pg[3.17( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923963547s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905029297s@ mbc={}] start_peering_interval up [5,3,1] -> [1,3,5], acting [5,3,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.18( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935873985s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916992188s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,4], acting [3,5,1] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.17( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923938751s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905029297s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.18( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935873985s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.916992188s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.19( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923839569s) [1,0,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.904907227s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.15( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935940742s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917114258s@ mbc={}] start_peering_interval up 
[3,5,1] -> [5,1,3], acting [3,5,1] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.15( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935940742s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.917114258s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.16( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.926757812s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.908081055s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.17( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935349464s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916748047s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,4], acting [3,5,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.15( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923473358s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905151367s@ mbc={}] start_peering_interval up [5,3,1] -> [2,3,4], acting [5,3,1] -> [2,3,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.14( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923303604s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905029297s@ mbc={}] start_peering_interval up [5,3,1] -> [3,2,4], acting [5,3,1] -> [3,2,4], acting_primary 5 -> 3, 
up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.15( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923446655s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905151367s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.14( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935741425s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917480469s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,1], acting [3,5,1] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.14( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923200607s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905029297s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.12( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924057961s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906005859s@ mbc={}] start_peering_interval up [5,3,1] -> [1,3,5], acting [5,3,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.14( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935695648s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917480469s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.12( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 
les/c/f=28/28/0 sis=31 pruub=10.924035072s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906005859s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.13( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923510551s) [3,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.905517578s@ mbc={}] start_peering_interval up [5,3,1] -> [3,4,2], acting [5,3,1] -> [3,4,2], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.13( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923488617s) [3,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.905517578s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.12( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935044289s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917114258s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,3], acting [3,5,1] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.12( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935044289s) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.917114258s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.11( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935156822s) [0,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917236328s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, 
up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.10( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923770905s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906005859s@ mbc={}] start_peering_interval up [5,3,1] -> [3,5,4], acting [5,3,1] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.10( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923749924s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906005859s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.11( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935102463s) [0,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917236328s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.10( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934917450s) [2,1,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917236328s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.10( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934895515s) [2,1,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917236328s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 
les/c/f=28/28/0 sis=31 pruub=10.923784256s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906372070s@ mbc={}] start_peering_interval up [5,3,1] -> [2,3,1], acting [5,3,1] -> [2,3,1], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923726082s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906372070s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.13( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934187889s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.916870117s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935432434s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917846680s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934660912s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917480469s@ mbc={}] start_peering_interval up [3,5,1] -> [4,2,3], acting [3,5,1] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost 
ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935007095s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917846680s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934618950s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917480469s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924374580s) [0,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907348633s@ mbc={}] start_peering_interval up [5,3,1] -> [0,5,4], acting [5,3,1] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924353600s) [0,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907348633s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923598289s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906738281s@ mbc={}] start_peering_interval up [5,3,1] -> [0,4,5], acting [5,3,1] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934491158s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917724609s@ mbc={}] 
start_peering_interval up [3,5,1] -> [5,0,1], acting [3,5,1] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.11( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922766685s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906005859s@ mbc={}] start_peering_interval up [5,3,1] -> [5,0,4], acting [5,3,1] -> [5,0,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923577309s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906738281s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934491158s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.917724609s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.13( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933991432s) [2,0,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.916870117s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.11( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922766685s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.906005859s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934110641s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 
crt=0'0 mlcod 0'0 active pruub 1118.918090820s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,4], acting [3,5,1] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934110641s) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.918090820s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922724724s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.906860352s@ mbc={}] start_peering_interval up [5,3,1] -> [3,2,4], acting [5,3,1] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933381081s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917724609s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,0], acting [3,5,1] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922611237s) [3,2,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.906860352s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933361053s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917724609s@ mbc={}] state: 
transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923115730s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907348633s@ mbc={}] start_peering_interval up [5,3,1] -> [3,1,5], acting [5,3,1] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.17( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933684349s) [3,5,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.916748047s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.a( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933485985s) [2,4,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.917724609s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,3], acting [3,5,1] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922876358s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907714844s@ mbc={}] start_peering_interval up [5,3,1] -> [4,0,5], acting [5,3,1] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922852516s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907714844s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 
pg_epoch: 31 pg[2.3( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933574677s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918457031s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,5], acting [3,5,1] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.3( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933526993s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918457031s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.2( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922780991s) [4,5,0] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907836914s@ mbc={}] start_peering_interval up [5,3,1] -> [4,5,0], acting [5,3,1] -> [4,5,0], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.2( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922744751s) [4,5,0] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907836914s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922776222s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907958984s@ mbc={}] start_peering_interval up [5,3,1] -> [4,3,2], acting [5,3,1] -> [4,3,2], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.a( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 
les/c/f=28/28/0 sis=31 pruub=10.933062553s) [2,4,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.917724609s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922705650s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907958984s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.6( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932945251s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918334961s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.6( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932924271s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918334961s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.7( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924454689s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.909912109s@ mbc={}] start_peering_interval up [5,3,1] -> [4,5,3], acting [5,3,1] -> [4,5,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.2( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933115005s) [3,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918701172s@ mbc={}] start_peering_interval up [3,5,1] -> [3,1,2], acting [3,5,1] -> [3,1,2], acting_primary 3 -> 3, 
up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.2( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.933100700s) [3,1,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918701172s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.7( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924364090s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.909912109s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.3( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922254562s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907958984s@ mbc={}] start_peering_interval up [5,3,1] -> [0,4,5], acting [5,3,1] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921727180s) [3,1,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907348633s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.3( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921961784s) [0,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907958984s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932245255s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918334961s@ mbc={}] start_peering_interval up [3,5,1] -> 
[4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932209969s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918334961s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.4( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923923492s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.910156250s@ mbc={}] start_peering_interval up [5,3,1] -> [1,2,3], acting [5,3,1] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.5( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932508469s) [2,4,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918701172s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,0], acting [3,5,1] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.4( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932271957s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918334961s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.5( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932435036s) [2,4,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 
mlcod 0'0 unknown NOTIFY pruub 1118.918701172s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.4( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.931917191s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918334961s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.4( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923786163s) [1,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.910156250s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.9( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932497025s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.919067383s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,5], acting [3,5,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.9( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932441711s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.919067383s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.6( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921246529s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.907958984s@ mbc={}] start_peering_interval up [5,3,1] -> [4,0,5], acting [5,3,1] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.8( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 
pruub=10.932141304s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.918945312s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.6( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.921147346s) [4,0,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.907958984s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.8( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.932101250s) [0,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.918945312s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.8( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924057007s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.910888672s@ mbc={}] start_peering_interval up [5,3,1] -> [2,1,0], acting [5,3,1] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.7( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.936571121s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923095703s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,1], acting [3,5,1] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.8( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924020767s) [2,1,0] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 
unknown NOTIFY pruub 1118.910888672s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.7( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.936212540s) [3,2,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923095703s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923654556s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.910766602s@ mbc={}] start_peering_interval up [5,3,1] -> [5,0,1], acting [5,3,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.5( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922659874s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.909423828s@ mbc={}] start_peering_interval up [5,3,1] -> [3,4,5], acting [5,3,1] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1b( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923654556s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.910766602s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924242020s) [5,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.911376953s@ mbc={}] start_peering_interval up [5,3,1] -> [5,1,0], acting [5,3,1] -> [5,1,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 
23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.931631088s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.919067383s@ mbc={}] start_peering_interval up [3,5,1] -> [5,0,4], acting [3,5,1] -> [5,0,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1a( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924242020s) [5,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.911376953s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1b( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.931631088s) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 1118.919067383s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.5( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.922058105s) [3,4,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.909423828s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924921036s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.912475586s@ mbc={}] start_peering_interval up [5,3,1] -> [5,0,1], acting [5,3,1] -> [5,0,1], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1d( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924921036s) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown pruub 
1118.912475586s@ mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935391426s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923095703s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1c( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935353279s) [2,3,1] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923095703s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924047470s) [0,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.911987305s@ mbc={}] start_peering_interval up [5,3,1] -> [0,4,2], acting [5,3,1] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1c( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.924012184s) [0,4,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.911987305s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935465813s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923339844s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,4], acting [3,5,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 
02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1d( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935432434s) [2,3,4] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923339844s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923787117s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.911987305s@ mbc={}] start_peering_interval up [5,3,1] -> [1,3,5], acting [5,3,1] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935186386s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.923339844s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1f( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923679352s) [1,3,5] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.911987305s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1e( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.935120583s) [4,5,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923339844s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934896469s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 
1118.923339844s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[2.1f( empty local-lis/les=27/28 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.934829712s) [4,3,2] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.923339844s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923625946s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active pruub 1118.912475586s@ mbc={}] start_peering_interval up [5,3,1] -> [4,2,3], acting [5,3,1] -> [4,2,3], acting_primary 5 -> 4, up_primary 5 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:57:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 31 pg[3.1e( empty local-lis/les=27/28 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31 pruub=10.923505783s) [4,2,3] r=-1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1118.912475586s@ mbc={}] state: transitioning to Stray Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.8( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.a( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.c( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 
localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.1c( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.13( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.f( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.e( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.5( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.15( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.10( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.1d( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.8( empty 
local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.11( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,1,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.16( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.1c( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [0,4,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.e( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.18( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.19( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,0,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.1( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,3,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.1f( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,3,2] r=2 
lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.1e( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [4,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.2( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,1,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.d( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.13( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,4,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.14( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,4] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.14( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.7( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [3,2,1] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.6( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray 
Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[2.4( empty local-lis/les=0/0 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.4( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,2,3] r=1 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.1d( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 31 pg[3.19( empty local-lis/les=0/0 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [1,0,2] r=2 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.b( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.11( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.1b( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.1d( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: 
react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.d( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.1b( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,0,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[3.1a( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.12( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.15( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 32 pg[2.18( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [5,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.13( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.c( empty local-lis/les=31/32 
n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.5( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[3.8( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,0] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.a( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,4,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.f( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,0,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.10( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,1,3] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[3.15( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,4] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[2.1c( empty local-lis/les=31/32 n=0 ec=27/20 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated 
Activating complete Feb 23 02:57:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 32 pg[3.e( empty local-lis/les=31/32 n=0 ec=27/22 lis/c=27/27 les/c/f=28/28/0 sis=31) [2,3,1] r=0 lpr=31 pi=[27,31)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:57:36 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.e scrub starts Feb 23 02:57:38 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.9 scrub starts Feb 23 02:57:39 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.e scrub ok Feb 23 02:57:39 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.a scrub starts Feb 23 02:57:39 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.a scrub ok Feb 23 02:57:40 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.d scrub starts Feb 23 02:57:40 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.d scrub ok Feb 23 02:57:40 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.15 scrub starts Feb 23 02:57:40 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.15 scrub ok Feb 23 02:57:41 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.c scrub starts Feb 23 02:57:41 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.c scrub ok Feb 23 02:57:44 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.8 scrub starts Feb 23 02:57:44 localhost ceph-osd[31633]: osd.2 pg_epoch: 33 pg[6.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [4,0,2] r=2 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:44 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 3.8 scrub ok Feb 23 02:57:45 localhost ceph-osd[32575]: osd.5 pg_epoch: 34 pg[7.0( empty local-lis/les=0/0 n=0 ec=34/34 lis/c=0/0 les/c/f=0/0/0 sis=34) [0,5,4] r=1 lpr=34 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:57:45 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.12 
scrub starts Feb 23 02:57:45 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.12 scrub ok Feb 23 02:57:47 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1d scrub starts Feb 23 02:57:47 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1d scrub ok Feb 23 02:57:47 localhost sshd[55764]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:57:51 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.15 scrub starts Feb 23 02:57:51 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.15 scrub ok Feb 23 02:57:52 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.f scrub starts Feb 23 02:57:52 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.f scrub ok Feb 23 02:57:54 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1a deep-scrub starts Feb 23 02:57:54 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1a deep-scrub ok Feb 23 02:57:54 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.10 scrub starts Feb 23 02:57:54 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.10 scrub ok Feb 23 02:57:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 02:57:56 localhost podman[55811]: 2026-02-23 07:57:56.312204291 +0000 UTC m=+0.090153830 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Feb 23 02:57:56 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.b scrub starts Feb 23 02:57:56 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.b scrub ok Feb 23 02:57:56 localhost podman[55811]: 2026-02-23 07:57:56.531525135 +0000 UTC m=+0.309474734 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, container_name=metrics_qdr, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 02:57:56 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 02:57:57 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.13 scrub starts Feb 23 02:57:57 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.13 scrub ok Feb 23 02:57:58 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.18 deep-scrub starts Feb 23 02:57:58 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.18 deep-scrub ok Feb 23 02:57:59 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.11 scrub starts Feb 23 02:57:59 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.11 scrub ok Feb 23 02:58:00 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.5 deep-scrub starts Feb 23 02:58:00 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.5 deep-scrub ok Feb 23 02:58:01 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1c scrub starts Feb 23 02:58:01 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1c scrub ok Feb 23 02:58:03 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1d scrub starts Feb 23 02:58:03 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 2.1d scrub ok Feb 23 02:58:05 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.1b scrub starts Feb 23 02:58:05 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 2.1b scrub ok Feb 23 02:58:06 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1b scrub starts Feb 23 02:58:06 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.1b scrub ok Feb 23 02:58:07 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.9 scrub starts Feb 23 02:58:07 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 3.9 scrub ok Feb 23 02:58:18 localhost python3[55855]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:58:20 localhost python3[55871]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:58:22 localhost python3[55887]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:58:23 localhost sshd[55888]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:58:25 localhost python3[55936]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:58:25 localhost python3[55979]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833504.8338923-92678-100508287553593/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=bb97f2335ebfccbfb2bd8d50bbb589ce7e034c5d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:58:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:58:26 localhost systemd[1]: tmp-crun.WQt1PZ.mount: Deactivated successfully. Feb 23 02:58:26 localhost podman[55994]: 2026-02-23 07:58:26.904970549 +0000 UTC m=+0.081157908 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 23 02:58:27 localhost podman[55994]: 2026-02-23 07:58:27.106402048 +0000 UTC m=+0.282589387 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 23 02:58:27 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 02:58:27 localhost ceph-osd[32575]: osd.5 pg_epoch: 39 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=39 pruub=10.236914635s) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 active pruub 1171.792724609s@ mbc={}] start_peering_interval up [4,0,5] -> [4,0,5], acting [4,0,5] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:27 localhost ceph-osd[32575]: osd.5 pg_epoch: 39 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=39 pruub=10.234679222s) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1171.792724609s@ mbc={}] state: transitioning to Stray Feb 23 02:58:27 localhost ceph-osd[31633]: osd.2 pg_epoch: 39 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=39 pruub=12.557024956s) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active pruub 1178.992553711s@ mbc={}] start_peering_interval up [2,4,3] -> [2,4,3], acting [2,4,3] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:27 localhost sshd[56024]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:58:27 localhost ceph-osd[31633]: osd.2 pg_epoch: 39 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=39 pruub=12.557024956s) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.992553711s@ mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1c( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown 
mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.17( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1d( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.2( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1b( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.14( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 
02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.f( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.16( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1e( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.9( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.7( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.5( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 
40 pg[5.15( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.18( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1f( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.3( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.11( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.6( empty local-lis/les=26/27 n=0 ec=39/26 
lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.a( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.c( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.19( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.10( empty local-lis/les=26/27 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1a( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1d( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.18( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.e( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) 
[4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.f( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.2( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.4( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.6( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.7( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.5( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.b( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 
lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.d( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.8( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.16( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1c( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.17( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.15( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.c( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.13( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 
pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.10( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.11( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.9( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.3( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.a( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.19( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.12( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1b( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 
pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1f( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.1e( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 40 pg[4.14( empty local-lis/les=24/25 n=0 ec=39/24 lis/c=24/24 les/c/f=25/25/0 sis=39) [4,0,5] r=2 lpr=39 pi=[24,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=39/40 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.18( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.5( empty local-lis/les=39/40 n=0 ec=39/26 
lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.2( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.7( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.6( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.4( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.19( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating 
complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.14( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.9( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.17( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.16( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.13( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.8( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 
sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.3( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.12( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.10( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.15( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 
localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.1c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.11( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:28 localhost ceph-osd[31633]: osd.2 pg_epoch: 40 pg[5.d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=26/26 les/c/f=27/27/0 sis=39) [2,4,3] r=0 lpr=39 pi=[26,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:29 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts Feb 23 02:58:29 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok Feb 23 02:58:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 41 pg[7.0( v 36'39 (0'0,36'39] local-lis/les=34/35 n=22 ec=34/34 lis/c=34/34 les/c/f=35/35/0 sis=41 pruub=11.574728966s) [0,5,4] r=1 lpr=41 pi=[34,41)/1 luod=0'0 lua=36'37 crt=36'39 lcod 36'38 mlcod 0'0 active pruub 1175.151367188s@ mbc={}] start_peering_interval up [0,5,4] -> [0,5,4], acting [0,5,4] -> [0,5,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 
02:58:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 41 pg[7.0( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=34/34 lis/c=34/34 les/c/f=35/35/0 sis=41 pruub=11.572754860s) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 lcod 36'38 mlcod 0'0 unknown NOTIFY pruub 1175.151367188s@ mbc={}] state: transitioning to Stray Feb 23 02:58:29 localhost ceph-osd[31633]: osd.2 pg_epoch: 41 pg[6.0( empty local-lis/les=33/34 n=0 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.716135979s) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 active pruub 1179.187866211s@ mbc={}] start_peering_interval up [4,0,2] -> [4,0,2], acting [4,0,2] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:29 localhost ceph-osd[31633]: osd.2 pg_epoch: 41 pg[6.0( empty local-lis/les=33/34 n=0 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=41 pruub=10.713320732s) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1179.187866211s@ mbc={}] state: transitioning to Stray Feb 23 02:58:30 localhost python3[56073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:58:30 localhost python3[56116]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833510.1481686-92678-160076057041176/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=04bfb06bbb9d2445e353d8ca8467b47fb8316e81 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.2( v 36'39 lc 0'0 (0'0,36'39] 
local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.d( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.7( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.3( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.4( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.5( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.f( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 
crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.e( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.8( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.c( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.b( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.6( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=2 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.a( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[32575]: osd.5 pg_epoch: 42 pg[7.9( v 36'39 lc 0'0 (0'0,36'39] local-lis/les=34/35 n=1 ec=41/34 lis/c=34/34 les/c/f=35/35/0 sis=41) [0,5,4] r=1 lpr=41 pi=[34,41)/1 crt=36'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost 
ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1a( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.13( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.5( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.7( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1c( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.2( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.11( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.16( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost 
ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.6( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1d( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1b( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.f( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.a( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.12( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.15( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.9( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost 
ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.d( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.c( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.e( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.8( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.18( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.b( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.17( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Feb 23 02:58:31 localhost 
ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.10( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.14( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.3( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.19( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.4( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1f( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:31 localhost ceph-osd[31633]: osd.2 pg_epoch: 42 pg[6.1e( empty local-lis/les=33/34 n=0 ec=41/33 lis/c=33/33 les/c/f=34/34/0 sis=41) [4,0,2] r=2 lpr=41 pi=[33,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1e( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828355789s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764892578s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637487411s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573974609s@ mbc={}] start_peering_interval up [4,0,5] -> [1,2,3], acting [4,0,5] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828296661s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764892578s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637428284s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573974609s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.2( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.5( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.835462570s) [0,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642700195s@ mbc={}] start_peering_interval up [4,0,2] -> [0,2,1], acting [4,0,2] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827306747s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764160156s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.1( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827263832s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764160156s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.6( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640380859s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [5,1,0], acting [4,0,5] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.4( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640395164s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [1,5,3], acting [4,0,5] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.6( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640380859s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.577514648s@ mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.4( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640355110s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577514648s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640630722s) [0,4,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.578002930s@ mbc={}] start_peering_interval up [4,0,5] -> [0,4,2], acting [4,0,5] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826497078s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.763916016s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640596390s) [0,4,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.578002930s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.7( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.834444046s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642822266s@ mbc={}] start_peering_interval up [4,0,2] -> [3,1,2], acting [4,0,2] -> [3,1,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826431274s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.763916016s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828287125s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.765869141s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826339722s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.763916016s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828247070s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.765869141s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826306343s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.763916016s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.2( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.835981369s) [3,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644531250s@ mbc={}] start_peering_interval up [4,0,2] -> [3,4,2], acting [4,0,2] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.8( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643124580s) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580932617s@ mbc={}] start_peering_interval up [4,0,5] -> [5,3,4], acting [4,0,5] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.8( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643124580s) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.580932617s@ mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826300621s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764160156s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828134537s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.766235352s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828027725s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.766235352s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826264381s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764160156s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.17( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639589310s) [1,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577880859s@ mbc={}] start_peering_interval up [4,0,5] -> [1,3,5], acting [4,0,5] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.2( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.835575104s) [3,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644531250s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.17( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639551163s) [1,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577880859s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.5( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.833481789s) [0,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642700195s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.15( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641688347s) [5,4,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580200195s@ mbc={}] start_peering_interval up [4,0,5] -> [5,4,0], acting [4,0,5] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.15( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641688347s) [5,4,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.580200195s@ mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.14( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.646197319s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [5,1,0], acting [4,0,5] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.14( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.646197319s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.584716797s@ mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.7( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.833578110s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642822266s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.17( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,4,0] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641355515s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [1,0,5], acting [4,0,5] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.641232491s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580322266s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.19( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.13( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640675545s) [0,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580078125s@ mbc={}] start_peering_interval up [4,0,5] -> [0,2,1], acting [4,0,5] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.10( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640784264s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580200195s@ mbc={}] start_peering_interval up [4,0,5] -> [1,2,3], acting [4,0,5] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.13( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640604973s) [0,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580078125s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.10( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640713692s) [1,2,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580200195s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.12( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1f( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.644660950s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [1,0,5], acting [4,0,5] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832584381s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [0,4,2], acting [4,0,2] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.644611359s) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584716797s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832528114s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1c( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,1,3] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832357407s) [1,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [1,0,5], acting [4,0,2] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832250595s) [1,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.9( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639661789s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [5,1,0], acting [4,0,5] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.9( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639661789s) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1178.580322266s@ mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639704704s) [0,1,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580566406s@ mbc={}] start_peering_interval up [4,0,5] -> [0,1,2], acting [4,0,5] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832991600s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644531250s@ mbc={}] start_peering_interval up [4,0,2] -> [0,4,2], acting [4,0,2] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.832948685s) [0,4,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644531250s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639678001s) [0,1,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580566406s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823955536s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.765014648s@ mbc={}] start_peering_interval up [0,5,4] -> [0,2,4], acting [0,5,4] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823917389s) [0,2,4] r=-1 lpr=43 pi=[41,43)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.765014648s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831695557s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643676758s@ mbc={}] start_peering_interval up [4,0,2] -> [3,1,2], acting [4,0,2] -> [3,1,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831587791s) [3,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643676758s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1b( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,3,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831339836s) [1,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643676758s@ mbc={}] start_peering_interval up [4,0,2] -> [1,3,2], acting [4,0,2] -> [1,3,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643181801s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,1], acting [4,0,5] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.7( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635969162s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [1,5,3], acting [4,0,5] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831272125s) [1,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643676758s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.7( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635932922s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577514648s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1f( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.643082619s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584716797s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831761360s) [2,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644287109s@ mbc={}] start_peering_interval up [4,0,2] -> [2,0,4], acting [4,0,2] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642868996s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584716797s@ mbc={}] start_peering_interval up [4,0,5] -> [3,1,5], acting [4,0,5] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.19( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639142990s) [2,1,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.581054688s@ mbc={}] start_peering_interval up [4,0,5] -> [2,1,0], acting [4,0,5] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1b( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642843246s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584716797s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831761360s) [2,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1185.644287109s@ mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.19( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639110565s) [2,1,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.581054688s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.12( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642658234s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.584594727s@ mbc={}] start_peering_interval up [4,0,5] -> [1,5,3], acting [4,0,5] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.12( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.642630577s) [1,5,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.584594727s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.3( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638156891s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,1], acting [4,0,5] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.3( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.3( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638125420s) [2,3,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580322266s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.3( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831203461s) [0,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [0,5,1], acting [4,0,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.3( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.831161499s) [0,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635065079s) [3,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577636719s@ mbc={}] start_peering_interval up [4,0,5] -> [3,2,1], acting [4,0,5] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.5( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634901047s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.577514648s@ mbc={}] start_peering_interval up [4,0,5] -> [3,5,1], acting [4,0,5] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.5( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634852409s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577514648s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635020256s) [3,2,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.577636719s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.11( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637431145s) [4,0,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580322266s@ mbc={}] start_peering_interval up [4,0,5] -> [4,0,2], acting [4,0,5] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.11( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637386322s) [4,0,2] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580322266s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.2( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630867004s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573974609s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,4], acting [4,0,5] -> [2,3,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630843163s) [2,0,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573974609s@ mbc={}] start_peering_interval up [4,0,5] -> [2,0,1], acting [4,0,5] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.2( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630802155s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573974609s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630814552s) [2,0,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573974609s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.19( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.830554008s) [0,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [0,1,2], acting [4,0,2] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1d( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630595207s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573852539s@ mbc={}] start_peering_interval up [4,0,5] -> [3,5,1], acting [4,0,5] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.e( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630550385s) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573852539s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634633064s) [2,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.578002930s@ mbc={}] start_peering_interval up [4,0,5] -> [2,1,3], acting [4,0,5] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629957199s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573486328s@ mbc={}] start_peering_interval up [4,0,5] -> [2,3,4], acting [4,0,5] -> [2,3,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1c( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634598732s) [2,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.578002930s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.19( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.830190659s) [0,1,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1d( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629925728s) [2,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573486328s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.18( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636716843s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.580444336s@ mbc={}] start_peering_interval up [4,0,5] -> [3,1,5], acting [4,0,5] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629552841s) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1178.573242188s@ mbc={}] start_peering_interval up [4,0,5] -> [3,4,5], acting [4,0,5] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.18( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636683464s) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.580444336s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[4.1a( empty local-lis/les=39/40 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.629494667s) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1178.573242188s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.1c( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.18( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829218864s) [4,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644287109s@ mbc={}] start_peering_interval up [4,0,2] -> [4,3,2], acting [4,0,2] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43
pruub=12.829623222s) [5,0,4] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644775391s@ mbc={}] start_peering_interval up [4,0,2] -> [5,0,4], acting [4,0,2] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.18( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829169273s) [4,3,2] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644287109s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.829527855s) [5,0,4] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644775391s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636437416s) [1,0,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452026367s@ mbc={}] start_peering_interval up [2,4,3] -> [1,0,5], acting [2,4,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640314102s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.455932617s@ mbc={}] start_peering_interval up [2,4,3] -> [0,2,4], acting [2,4,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828975677s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 
0'0 active pruub 1185.644531250s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,0], acting [4,0,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.640275002s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.455932617s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636365891s) [1,0,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452026367s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636495590s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452148438s@ mbc={}] start_peering_interval up [2,4,3] -> [2,0,4], acting [2,4,3] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828919411s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644531250s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636495590s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1183.452148438s@ mbc={}] state: transitioning to Primary Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.14( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 
pruub=12.828634262s) [4,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [4,3,5], acting [4,0,2] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637408257s) [4,5,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453247070s@ mbc={}] start_peering_interval up [2,4,3] -> [4,5,0], acting [2,4,3] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637366295s) [4,5,0] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453247070s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.10( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828404427s) [4,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644287109s@ mbc={}] start_peering_interval up [4,0,2] -> [4,2,3], acting [4,0,2] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.10( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828297615s) [4,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644287109s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637184143s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 
0'0 active pruub 1183.453247070s@ mbc={}] start_peering_interval up [2,4,3] -> [5,1,3], acting [2,4,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.13( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.637134552s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453247070s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636083603s) [0,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452392578s@ mbc={}] start_peering_interval up [2,4,3] -> [0,1,2], acting [2,4,3] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636673927s) [4,2,0] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453002930s@ mbc={}] start_peering_interval up [2,4,3] -> [4,2,0], acting [2,4,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636037827s) [0,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452392578s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827363014s) [5,4,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643676758s@ mbc={}] start_peering_interval up 
[4,0,2] -> [5,4,0], acting [4,0,2] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636640549s) [4,2,0] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453002930s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827264786s) [5,4,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643676758s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635220528s) [0,4,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.451782227s@ mbc={}] start_peering_interval up [2,4,3] -> [0,4,2], acting [2,4,3] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635187149s) [0,4,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.451782227s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.8( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827570915s) [3,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644165039s@ mbc={}] start_peering_interval up [4,0,2] -> [3,2,4], acting [4,0,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 
pg[6.8( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.827534676s) [3,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644165039s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.14( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.828581810s) [4,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635999680s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452880859s@ mbc={}] start_peering_interval up [2,4,3] -> [5,1,3], acting [2,4,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639410973s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.456298828s@ mbc={}] start_peering_interval up [2,4,3] -> [2,0,4], acting [2,4,3] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.b( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635934830s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452880859s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.d( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.639410973s) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1183.456298828s@ mbc={}] state: transitioning to Primary Feb 
23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635552406s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452514648s@ mbc={}] start_peering_interval up [2,4,3] -> [0,2,4], acting [2,4,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635453224s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452514648s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.15( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826209068s) [3,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [3,5,1], acting [4,0,2] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.15( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826155663s) [3,5,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635988235s) [3,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453247070s@ mbc={}] start_peering_interval up [2,4,3] -> [3,1,2], acting [2,4,3] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.16( empty 
local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635951996s) [3,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453247070s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826085091s) [5,0,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [5,0,1], acting [4,0,2] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638475418s) [3,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.456054688s@ mbc={}] start_peering_interval up [2,4,3] -> [3,2,4], acting [2,4,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826013565s) [5,0,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.638434410s) [3,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.456054688s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635375977s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453002930s@ mbc={}] start_peering_interval up [2,4,3] -> [0,5,1], 
acting [2,4,3] -> [0,5,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.12( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635248184s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453002930s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636285782s) [1,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454345703s@ mbc={}] start_peering_interval up [2,4,3] -> [1,3,2], acting [2,4,3] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.636236191s) [1,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454345703s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.13( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.825163841s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643188477s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,3], acting [4,0,2] -> [4,5,3], 
acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634002686s) [3,1,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452392578s@ mbc={}] start_peering_interval up [2,4,3] -> [3,1,5], acting [2,4,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.4( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826061249s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644409180s@ mbc={}] start_peering_interval up [4,0,2] -> [4,0,5], acting [4,0,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633947372s) [3,1,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452392578s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.4( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.826021194s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644409180s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824807167s) [5,3,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643188477s@ mbc={}] start_peering_interval up [4,0,2] -> [5,3,1], acting [4,0,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features 
acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.b( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824769974s) [5,3,1] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643188477s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.627758026s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.446411133s@ mbc={}] start_peering_interval up [2,4,3] -> [0,2,4], acting [2,4,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.627714157s) [0,2,4] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.446411133s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.6( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824810028s) [1,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643554688s@ mbc={}] start_peering_interval up [4,0,2] -> [1,3,5], acting [4,0,2] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1d( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824954033s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 
unknown NOTIFY pruub 1185.643188477s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633456230s) [4,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452270508s@ mbc={}] start_peering_interval up [2,4,3] -> [4,3,5], acting [2,4,3] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.6( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.824737549s) [1,3,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643554688s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633417130s) [4,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452270508s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.4( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.16( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823819160s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642822266s@ mbc={}] start_peering_interval up [4,0,2] -> [4,0,5], acting [4,0,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.16( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.823763847s) [4,0,5] r=-1 lpr=43 pi=[41,43)/1 
crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642822266s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635361671s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454467773s@ mbc={}] start_peering_interval up [2,4,3] -> [3,4,5], acting [2,4,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635313988s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454467773s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634619713s) [5,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454101562s@ mbc={}] start_peering_interval up [2,4,3] -> [5,3,4], acting [2,4,3] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.12( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634549141s) [5,3,4] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454101562s@ mbc={}] state: transitioning to Stray Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632687569s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452392578s@ mbc={}] start_peering_interval up [2,4,3] -> [3,4,5], acting [2,4,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 
upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634292603s) [4,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.453979492s@ mbc={}] start_peering_interval up [2,4,3] -> [4,3,2], acting [2,4,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632593155s) [3,4,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452392578s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634172440s) [4,3,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.453979492s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634132385s) [3,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454101562s@ mbc={}] start_peering_interval up [2,4,3] -> [3,5,1], acting [2,4,3] -> [3,5,1], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822861671s) [5,1,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642700195s@ mbc={}] start_peering_interval up [4,0,2] -> [5,1,3], acting [4,0,2] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.634088516s) [3,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454101562s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822648048s) [5,1,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642700195s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.631948471s) [4,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452514648s@ mbc={}] start_peering_interval up [2,4,3] -> [4,0,2], acting [2,4,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632093430s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452636719s@ mbc={}] start_peering_interval up [2,4,3] -> [5,1,3], acting [2,4,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.631814957s) [4,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452514648s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.11( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822202682s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642944336s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,3], acting [4,0,2] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.4( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.631904602s) [5,1,3] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452636719s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635508537s) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.456176758s@ mbc={}] start_peering_interval up [2,4,3] -> [1,2,3], acting [2,4,3] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.635181427s) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.456176758s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.11( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822152138s) [4,5,3] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642944336s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633783340s) [1,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.455078125s@ mbc={}] start_peering_interval up [2,4,3] -> [1,0,2], acting [2,4,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.633742332s) [1,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.455078125s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821985245s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.643310547s@ mbc={}] start_peering_interval up [4,0,2] -> [4,5,0], acting [4,0,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.f( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821942329s) [4,5,0] r=-1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.643310547s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.9( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822620392s) [4,2,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.644165039s@ mbc={}] start_peering_interval up [4,0,2] -> [4,2,0], acting [4,0,2] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.9( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.822558403s) [4,2,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.644165039s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632662773s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.454467773s@ mbc={}] start_peering_interval up [2,4,3] -> [0,5,1], acting [2,4,3] -> [0,5,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.820868492s) [3,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642578125s@ mbc={}] start_peering_interval up [4,0,2] -> [3,2,1], acting [4,0,2] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.13( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821425438s) [1,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active pruub 1185.642944336s@ mbc={}] start_peering_interval up [4,0,2] -> [1,2,3], acting [4,0,2] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630964279s) [1,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active pruub 1183.452758789s@ mbc={}] start_peering_interval up [2,4,3] -> [1,3,5], acting [2,4,3] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.1a( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.820826530s) [3,2,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642578125s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[6.13( empty local-lis/les=41/42 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43 pruub=12.821215630s) [1,2,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.642944336s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.632577896s) [0,5,1] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.454467773s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:34 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=39/40 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43 pruub=10.630919456s) [1,3,5] r=-1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1183.452758789s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.1d( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,0,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.c( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [1,0,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.3( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,5,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.6( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [1,3,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.1( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.19( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.3( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.9( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.10( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1d( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.14( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,3,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.1f( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.f( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,0] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.10( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.11( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,5,3] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.4( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,0,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.15( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [3,5,1] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.1f( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,5,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.d( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.15( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[6.16( empty local-lis/les=0/0 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [4,0,5] r=2 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.f( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [1,2,3] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.7( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,1,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.c( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,4,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.17( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [4,5,0] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.f( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.1( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,4,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.9( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 43 pg[5.5( empty local-lis/les=0/0 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [4,3,5] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.a( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,1,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.b( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.7( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[7.5( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=43) [0,2,4] r=1 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.13( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [0,2,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.11( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [4,0,2] r=2 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 43 pg[4.d( empty local-lis/les=0/0 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [3,2,1] r=1 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.15( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,4,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.8( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,4,0] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.12( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.2( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,0,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [2,0,4] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1d( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.9( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,3,1] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=41/33 lis/c=41/41 les/c/f=42/42/0 sis=43) [5,1,3] r=0 lpr=43 pi=[41,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1f( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[5.1a( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.13( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.3( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,3,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,1] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.b( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.14( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[5.4( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.1c( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,3] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[5.d( empty local-lis/les=43/44 n=0 ec=39/26 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,0,4] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[32575]: osd.5 pg_epoch: 44 pg[4.6( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [5,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost ceph-osd[31633]: osd.2 pg_epoch: 44 pg[4.19( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=39/39 les/c/f=40/40/0 sis=43) [2,1,0] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Feb 23 02:58:35 localhost python3[56179]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:58:36 localhost python3[56222]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833515.3279996-92678-225510035549802/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=b30b176c5dadfc33fbdfb5fdc77f69e2337fe39c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.783137321s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764770508s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.783046722s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764770508s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780872345s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764038086s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780620575s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764038086s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.782555580s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.765991211s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.782437325s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.765991211s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780421257s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1180.764526367s@ mbc={}] start_peering_interval up [0,5,4] -> [4,5,0], acting [0,5,4] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:36 localhost ceph-osd[32575]: osd.5 pg_epoch: 45 pg[7.2( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=45 pruub=10.780347824s) [4,5,0] r=1 lpr=45 pi=[41,45)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1180.764526367s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:37 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.12 deep-scrub starts
Feb 23 02:58:38 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.14 deep-scrub starts
Feb 23 02:58:39 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Feb 23 02:58:39 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.8 deep-scrub starts
Feb 23 02:58:40 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.e scrub starts
Feb 23 02:58:42 localhost python3[56284]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:58:42 localhost python3[56329]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833521.8651876-92992-123673506454102/source _original_basename=tmpxhp5uara follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:58:42 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Feb 23 02:58:43 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Feb 23 02:58:43 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.887340546s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.873168945s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.879220009s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.864990234s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.887211800s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.873168945s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.878021240s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.864257812s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.868871689s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1197.854614258s@ mbc={}] start_peering_interval up [0,2,4] -> [1,3,2], acting [0,2,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.877858162s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.864257812s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.3( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.868235588s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.854614258s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:43 localhost ceph-osd[31633]: osd.2 pg_epoch: 47 pg[7.b( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=47 pruub=15.878250122s) [1,3,2] r=2 lpr=47 pi=[43,47)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1197.864990234s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:43 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.2 scrub starts
Feb 23 02:58:43 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.2 scrub ok
Feb 23 02:58:43 localhost python3[56391]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 02:58:44 localhost python3[56434]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833523.4632533-93080-128830097371912/source _original_basename=tmpww2g8u89 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 02:58:44 localhost python3[56464]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Feb 23 02:58:45 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Feb 23 02:58:45 localhost python3[56482]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 02:58:45 localhost ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.516323090s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.767578125s@ mbc={}] start_peering_interval up [0,5,4] -> [1,3,2], acting [0,5,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:45 localhost ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.514539719s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1188.765991211s@ mbc={}] start_peering_interval up [0,5,4] -> [1,3,2], acting [0,5,4] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 23 02:58:45 localhost ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.4( v 36'39 (0'0,36'39] local-lis/les=41/42 n=2 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.516221046s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.767578125s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:45 localhost ceph-osd[32575]: osd.5 pg_epoch: 49 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49 pruub=9.514459610s) [1,3,2] r=-1 lpr=49 pi=[41,49)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1188.765991211s@ mbc={}] state: transitioning to Stray
Feb 23 02:58:46 localhost ceph-osd[31633]: osd.2 pg_epoch: 49 pg[7.4( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49) [1,3,2] r=2 lpr=49 pi=[41,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:46 localhost ceph-osd[31633]: osd.2 pg_epoch: 49 pg[7.c( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=49) [1,3,2] r=2 lpr=49 pi=[41,49)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Feb 23 02:58:46 localhost ansible-async_wrapper.py[56654]: Invoked with 726138795097 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833526.3414202-93440-221100782994957/AnsiballZ_command.py _
Feb 23 02:58:46 localhost ansible-async_wrapper.py[56657]: Starting module and watcher
Feb 23 02:58:46 localhost ansible-async_wrapper.py[56657]: Start watching 56658 (3600)
Feb 23 02:58:46 localhost ansible-async_wrapper.py[56658]: Start module (56658)
Feb 23 02:58:46 localhost ansible-async_wrapper.py[56654]: Return async_wrapper task started.
Feb 23 02:58:47 localhost python3[56678]: ansible-ansible.legacy.async_status Invoked with jid=726138795097.56654 mode=status _async_dir=/tmp/.ansible_async
Feb 23 02:58:48 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.d scrub starts
Feb 23 02:58:48 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.d scrub ok
Feb 23 02:58:49 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.b scrub starts
Feb 23 02:58:49 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.b scrub ok
Feb 23 02:58:49 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Feb 23 02:58:49 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Feb 23 02:58:50 localhost puppet-user[56676]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 02:58:50 localhost puppet-user[56676]: (file: /etc/puppet/hiera.yaml)
Feb 23 02:58:50 localhost puppet-user[56676]: Warning: Undefined variable '::deploy_config_name';
Feb 23 02:58:50 localhost puppet-user[56676]: (file & line not available)
Feb 23 02:58:50 localhost puppet-user[56676]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 02:58:50 localhost puppet-user[56676]: (file & line not available)
Feb 23 02:58:50 localhost puppet-user[56676]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 02:58:50 localhost puppet-user[56676]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 02:58:50 localhost puppet-user[56676]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.14 seconds
Feb 23 02:58:51 localhost puppet-user[56676]: Notice: Applied catalog in 0.04 seconds
Feb 23 02:58:51 localhost puppet-user[56676]: Application:
Feb 23 02:58:51 localhost puppet-user[56676]: Initial environment: production
Feb 23 02:58:51 localhost puppet-user[56676]: Converged environment: production
Feb 23 02:58:51 localhost puppet-user[56676]: Run mode: user
Feb 23 02:58:51 localhost puppet-user[56676]: Changes:
Feb 23 02:58:51 localhost puppet-user[56676]: Events:
Feb 23 02:58:51 localhost puppet-user[56676]: Resources:
Feb 23 02:58:51 localhost puppet-user[56676]: Total: 10
Feb 23 02:58:51 localhost puppet-user[56676]: Time:
Feb 23 02:58:51 localhost puppet-user[56676]: Schedule: 0.00
Feb 23 02:58:51 localhost puppet-user[56676]: File: 0.00
Feb 23 02:58:51 localhost puppet-user[56676]: Exec: 0.01
Feb 23 02:58:51 localhost puppet-user[56676]: Augeas: 0.01
Feb 23 02:58:51 localhost puppet-user[56676]: Transaction evaluation: 0.04
Feb 23 02:58:51 localhost puppet-user[56676]: Catalog application: 0.04
Feb 23 02:58:51 localhost puppet-user[56676]: Config retrieval: 0.18
Feb 23 02:58:51 localhost puppet-user[56676]: Last run: 1771833531
Feb 23 02:58:51 localhost puppet-user[56676]: Filebucket: 0.00
Feb 23 02:58:51 localhost puppet-user[56676]: Total: 0.05
Feb 23 02:58:51 localhost puppet-user[56676]: Version:
Feb 23
02:58:51 localhost puppet-user[56676]: Config: 1771833530 Feb 23 02:58:51 localhost puppet-user[56676]: Puppet: 7.10.0 Feb 23 02:58:51 localhost ansible-async_wrapper.py[56658]: Module complete (56658) Feb 23 02:58:51 localhost ansible-async_wrapper.py[56657]: Done in kid B. Feb 23 02:58:52 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1c scrub starts Feb 23 02:58:52 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1c scrub ok Feb 23 02:58:53 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.8 scrub starts Feb 23 02:58:53 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.8 scrub ok Feb 23 02:58:53 localhost ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.619922638s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1205.873291016s@ mbc={}] start_peering_interval up [0,2,4] -> [2,4,0], acting [0,2,4] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:53 localhost ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=43/44 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.619922638s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 unknown pruub 1205.873291016s@ mbc={}] state: transitioning to Primary Feb 23 02:58:53 localhost ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.609990120s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1205.864257812s@ mbc={}] start_peering_interval up [0,2,4] -> [2,4,0], acting [0,2,4] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:53 localhost ceph-osd[31633]: osd.2 pg_epoch: 51 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 
ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51 pruub=13.609990120s) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 unknown pruub 1205.864257812s@ mbc={}] state: transitioning to Primary Feb 23 02:58:54 localhost ceph-osd[31633]: osd.2 pg_epoch: 52 pg[7.5( v 36'39 (0'0,36'39] local-lis/les=51/52 n=2 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:54 localhost ceph-osd[31633]: osd.2 pg_epoch: 52 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=51/52 n=1 ec=41/34 lis/c=43/43 les/c/f=44/46/0 sis=51) [2,4,0] r=0 lpr=51 pi=[43,51)/1 crt=36'39 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:58:55 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.15 scrub starts Feb 23 02:58:55 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.15 scrub ok Feb 23 02:58:55 localhost ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.6( v 36'39 (0'0,36'39] local-lis/les=45/46 n=2 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.583827972s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1203.042968750s@ mbc={}] start_peering_interval up [4,5,0] -> [1,0,5], acting [4,5,0] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:55 localhost ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.579640388s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1203.038696289s@ mbc={}] start_peering_interval up [4,5,0] -> [1,0,5], acting [4,5,0] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:58:55 localhost ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.6( v 36'39 (0'0,36'39] 
local-lis/les=45/46 n=2 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.583735466s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1203.042968750s@ mbc={}] state: transitioning to Stray Feb 23 02:58:55 localhost ceph-osd[32575]: osd.5 pg_epoch: 53 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=13.579553604s) [1,0,5] r=2 lpr=53 pi=[45,53)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1203.038696289s@ mbc={}] state: transitioning to Stray Feb 23 02:58:56 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.4 scrub starts Feb 23 02:58:56 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.4 scrub ok Feb 23 02:58:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:58:57 localhost systemd[1]: tmp-crun.YsNcMR.mount: Deactivated successfully. Feb 23 02:58:57 localhost podman[56932]: 2026-02-23 07:58:57.542230134 +0000 UTC m=+0.097080617 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 02:58:57 localhost python3[56931]: ansible-ansible.legacy.async_status Invoked with jid=726138795097.56654 mode=status _async_dir=/tmp/.ansible_async Feb 23 02:58:57 localhost podman[56932]: 2026-02-23 07:58:57.729860551 +0000 UTC m=+0.284710964 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., release=1766032510, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container) Feb 23 02:58:57 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:58:58 localhost python3[56978]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 23 02:58:58 localhost python3[56994]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 02:58:59 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.9 deep-scrub starts Feb 23 02:58:59 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.9 deep-scrub ok Feb 23 02:58:59 localhost python3[57044]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:58:59 localhost python3[57062]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpy_6141th recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 23 02:58:59 localhost python3[57092]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:00 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1b scrub starts Feb 23 02:59:00 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.1b scrub ok Feb 23 02:59:00 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.1a scrub starts Feb 23 02:59:00 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.1a scrub ok Feb 23 02:59:01 localhost python3[57195]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 23 02:59:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 02:59:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4315 writes, 20K keys, 4315 commit groups, 1.0 
writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4315 writes, 358 syncs, 12.05 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1058 writes, 3920 keys, 1058 commit groups, 1.0 writes per commit group, ingest: 1.69 MB, 0.00 MB/s#012Interval WAL: 1058 writes, 214 syncs, 4.94 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 
0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 
GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) 
KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 m Feb 23 02:59:01 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1f scrub starts Feb 23 02:59:01 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1f scrub ok Feb 23 02:59:01 localhost python3[57214]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:02 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.17 scrub starts Feb 23 02:59:02 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.17 scrub ok Feb 23 02:59:02 localhost python3[57246]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True 
get_attributes=True checksum_algorithm=sha1 Feb 23 02:59:03 localhost python3[57296]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:59:03 localhost ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.910213470s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1215.075073242s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,1], acting [1,3,2] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:03 localhost ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.910005569s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1215.075073242s@ mbc={}] state: transitioning to Stray Feb 23 02:59:03 localhost ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.909722328s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1215.074951172s@ mbc={}] start_peering_interval up [1,3,2] -> [3,5,1], acting [1,3,2] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:03 localhost ceph-osd[31633]: osd.2 pg_epoch: 55 pg[7.7( v 36'39 (0'0,36'39] local-lis/les=47/48 n=1 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=12.909592628s) [3,5,1] r=-1 lpr=55 pi=[47,55)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1215.074951172s@ mbc={}] state: transitioning to Stray Feb 23 02:59:03 localhost python3[57314]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown 
_original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:04 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.13 scrub starts Feb 23 02:59:04 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.13 scrub ok Feb 23 02:59:04 localhost python3[57376]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:59:04 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1 scrub starts Feb 23 02:59:04 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1 scrub ok Feb 23 02:59:04 localhost python3[57394]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:04 localhost ceph-osd[32575]: osd.5 pg_epoch: 55 pg[7.7( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55) [3,5,1] r=1 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:59:04 localhost ceph-osd[32575]: osd.5 pg_epoch: 55 pg[7.f( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=47/47 les/c/f=48/48/0 sis=55) [3,5,1] r=1 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:59:05 
localhost python3[57456]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:59:05 localhost python3[57474]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:05 localhost ceph-osd[32575]: osd.5 pg_epoch: 57 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=57 pruub=13.205279350s) [1,0,5] r=2 lpr=57 pi=[41,57)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1212.765014648s@ mbc={}] start_peering_interval up [0,5,4] -> [1,0,5], acting [0,5,4] -> [1,0,5], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:05 localhost ceph-osd[32575]: osd.5 pg_epoch: 57 pg[7.8( v 36'39 (0'0,36'39] local-lis/les=41/42 n=1 ec=41/34 lis/c=41/41 les/c/f=42/42/0 sis=57 pruub=13.205115318s) [1,0,5] r=2 lpr=57 pi=[41,57)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1212.765014648s@ mbc={}] state: transitioning to Stray Feb 23 02:59:06 localhost python3[57536]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:59:06 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 6.12 scrub starts Feb 23 02:59:06 localhost ceph-osd[32575]: 
log_channel(cluster) log [DBG] : 6.12 scrub ok Feb 23 02:59:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 02:59:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4867 writes, 22K keys, 4867 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4867 writes, 489 syncs, 9.95 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1482 writes, 5427 keys, 1482 commit groups, 1.0 writes per commit group, ingest: 2.10 MB, 0.00 MB/s#012Interval WAL: 1482 writes, 293 syncs, 5.06 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 
0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 6.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 
Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me Feb 23 02:59:06 localhost python3[57554]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 
02:59:06 localhost sshd[57569]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:59:06 localhost python3[57586]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:59:06 localhost systemd[1]: Reloading. Feb 23 02:59:06 localhost systemd-sysv-generator[57616]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:59:06 localhost systemd-rc-local-generator[57610]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:59:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 02:59:07 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.14 scrub starts Feb 23 02:59:07 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 4.14 scrub ok Feb 23 02:59:07 localhost python3[57672]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:59:08 localhost python3[57690]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:08 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.12 scrub 
starts Feb 23 02:59:08 localhost ceph-osd[32575]: log_channel(cluster) log [DBG] : 5.12 scrub ok Feb 23 02:59:08 localhost python3[57752]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 02:59:08 localhost python3[57770]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:09 localhost python3[57800]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 02:59:09 localhost systemd[1]: Reloading. Feb 23 02:59:09 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.19 scrub starts Feb 23 02:59:09 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.19 scrub ok Feb 23 02:59:09 localhost systemd-rc-local-generator[57822]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 02:59:09 localhost systemd-sysv-generator[57828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 02:59:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 02:59:09 localhost systemd[1]: Starting Create netns directory... Feb 23 02:59:09 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 23 02:59:09 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 02:59:09 localhost systemd[1]: Finished Create netns directory. Feb 23 02:59:10 localhost python3[57857]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 23 02:59:11 localhost python3[57916]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 23 02:59:12 localhost podman[57993]: 2026-02-23 07:59:12.241465965 +0000 UTC m=+0.075517733 container create bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, container_name=nova_compute_init_log, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step2, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc.) Feb 23 02:59:12 localhost podman[57994]: 2026-02-23 07:59:12.277943745 +0000 UTC m=+0.102908619 container create cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 23 02:59:12 localhost systemd[1]: Started libpod-conmon-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b.scope. Feb 23 02:59:12 localhost podman[57993]: 2026-02-23 07:59:12.201190866 +0000 UTC m=+0.035242614 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 02:59:12 localhost systemd[1]: Started libcrun container. Feb 23 02:59:12 localhost systemd[1]: Started libpod-conmon-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64.scope. 
Feb 23 02:59:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f56bdf141506d099102e067531f1fbfb1e40a67a70799f11577fc9c27fb9f83a/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 23 02:59:12 localhost podman[57994]: 2026-02-23 07:59:12.228451217 +0000 UTC m=+0.053416081 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 02:59:12 localhost podman[57993]: 2026-02-23 07:59:12.330281252 +0000 UTC m=+0.164333020 container init bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, 
name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_compute_init_log) Feb 23 02:59:12 localhost systemd[1]: Started libcrun container. Feb 23 02:59:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Feb 23 02:59:12 localhost podman[57993]: 2026-02-23 07:59:12.337014882 +0000 UTC m=+0.171066660 container start bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step2, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_compute_init_log, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc.) Feb 23 02:59:12 localhost systemd[1]: libpod-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b.scope: Deactivated successfully. Feb 23 02:59:12 localhost python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Feb 23 02:59:12 localhost podman[57994]: 2026-02-23 07:59:12.346957234 +0000 UTC m=+0.171922098 container init cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, container_name=nova_virtqemud_init_logs, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 02:59:12 localhost podman[57994]: 2026-02-23 07:59:12.354779907 +0000 UTC m=+0.179744771 container start cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud_init_logs, build-date=2026-01-12T23:31:49Z, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 02:59:12 localhost python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label 
config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm Feb 23 02:59:12 localhost systemd[1]: libpod-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64.scope: Deactivated successfully. 
Feb 23 02:59:12 localhost podman[58044]: 2026-02-23 07:59:12.41497266 +0000 UTC m=+0.044144382 container died cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, config_id=tripleo_step2, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, container_name=nova_virtqemud_init_logs, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-libvirt) Feb 23 02:59:12 localhost podman[58030]: 2026-02-23 07:59:12.474961755 +0000 UTC m=+0.116297397 container died bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.expose-services=, release=1766032510, container_name=nova_compute_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step2, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 02:59:12 localhost podman[58050]: 2026-02-23 07:59:12.517138854 +0000 UTC 
m=+0.135297321 container cleanup cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, architecture=x86_64, url=https://www.redhat.com, container_name=nova_virtqemud_init_logs, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 02:59:12 localhost systemd[1]: 
libpod-conmon-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64.scope: Deactivated successfully. Feb 23 02:59:12 localhost podman[58032]: 2026-02-23 07:59:12.648234494 +0000 UTC m=+0.291623140 container cleanup bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, container_name=nova_compute_init_log, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
release=1766032510) Feb 23 02:59:12 localhost systemd[1]: libpod-conmon-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b.scope: Deactivated successfully. Feb 23 02:59:12 localhost podman[58178]: 2026-02-23 07:59:12.929563891 +0000 UTC m=+0.100940707 container create 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=create_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step2, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt) Feb 23 02:59:12 localhost podman[58179]: 2026-02-23 07:59:12.949534655 +0000 UTC m=+0.113799239 container create c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, config_id=tripleo_step2, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, batch=17.1_20260112.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Feb 23 02:59:12 localhost podman[58178]: 2026-02-23 07:59:12.878299047 +0000 UTC m=+0.049675883 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 02:59:12 localhost systemd[1]: Started libpod-conmon-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope. 
Feb 23 02:59:12 localhost podman[58179]: 2026-02-23 07:59:12.888234268 +0000 UTC m=+0.052498852 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 23 02:59:12 localhost systemd[1]: Started libpod-conmon-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope. Feb 23 02:59:12 localhost systemd[1]: Started libcrun container. Feb 23 02:59:13 localhost systemd[1]: Started libcrun container. Feb 23 02:59:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3d90fa516ab23da2bd722ed78a874566a7daf8e8d3d852895f80962cb5a1d59/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 23 02:59:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 02:59:13 localhost podman[58178]: 2026-02-23 07:59:13.012726271 +0000 UTC m=+0.184103097 container init 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red 
Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc.) 
Feb 23 02:59:13 localhost podman[58179]: 2026-02-23 07:59:13.01526222 +0000 UTC m=+0.179526804 container init c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step2, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 02:59:13 localhost podman[58178]: 2026-02-23 07:59:13.023220089 +0000 UTC m=+0.194596915 container start 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_id=tripleo_step2, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container) Feb 23 02:59:13 localhost podman[58178]: 2026-02-23 07:59:13.023515028 +0000 UTC m=+0.194891854 container attach 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, 
com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step2, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, container_name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.expose-services=) Feb 23 02:59:13 localhost podman[58179]: 2026-02-23 07:59:13.075012488 +0000 UTC m=+0.239277072 container start c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, version=17.1.13, container_name=create_haproxy_wrapper, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 02:59:13 localhost podman[58179]: 2026-02-23 07:59:13.075489844 +0000 UTC m=+0.239754478 container attach c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack 
TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step2) Feb 23 02:59:13 localhost systemd[1]: var-lib-containers-storage-overlay-e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7-merged.mount: Deactivated successfully. 
Feb 23 02:59:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cad8a5ff693667746f9d919cd3db2e0eb6451f9ad1ca0a5475ab95ac31073c64-userdata-shm.mount: Deactivated successfully. Feb 23 02:59:13 localhost systemd[1]: var-lib-containers-storage-overlay-f56bdf141506d099102e067531f1fbfb1e40a67a70799f11577fc9c27fb9f83a-merged.mount: Deactivated successfully. Feb 23 02:59:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bdb13ba4b492bf8033ec5cf92c719fd10dc0a22d07c6f677643c18f7f61a227b-userdata-shm.mount: Deactivated successfully. Feb 23 02:59:13 localhost ceph-osd[31633]: osd.2 pg_epoch: 59 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=9.269790649s) [4,2,3] r=1 lpr=59 pi=[43,59)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1221.865112305s@ mbc={}] start_peering_interval up [0,2,4] -> [4,2,3], acting [0,2,4] -> [4,2,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:13 localhost ceph-osd[31633]: osd.2 pg_epoch: 59 pg[7.9( v 36'39 (0'0,36'39] local-lis/les=43/44 n=1 ec=41/34 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=9.269694328s) [4,2,3] r=1 lpr=59 pi=[43,59)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.865112305s@ mbc={}] state: transitioning to Stray Feb 23 02:59:14 localhost ovs-vsctl[58282]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Feb 23 02:59:15 localhost systemd[1]: libpod-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope: Deactivated successfully. Feb 23 02:59:15 localhost systemd[1]: libpod-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope: Consumed 2.189s CPU time. 
Feb 23 02:59:15 localhost podman[58178]: 2026-02-23 07:59:15.209158951 +0000 UTC m=+2.380535797 container died 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 23 02:59:15 localhost systemd[1]: tmp-crun.lHo5ud.mount: Deactivated successfully. Feb 23 02:59:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4-userdata-shm.mount: Deactivated successfully. Feb 23 02:59:15 localhost systemd[1]: var-lib-containers-storage-overlay-c3d90fa516ab23da2bd722ed78a874566a7daf8e8d3d852895f80962cb5a1d59-merged.mount: Deactivated successfully. 
Feb 23 02:59:15 localhost podman[58431]: 2026-02-23 07:59:15.326521691 +0000 UTC m=+0.105323774 container cleanup 9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, container_name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step2, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 02:59:15 localhost systemd[1]: libpod-conmon-9cd8a900e61995a816f43443ec7eb7bdcb3798f95e9d558c75665ba510a0aca4.scope: Deactivated successfully. Feb 23 02:59:15 localhost python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Feb 23 02:59:15 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.3 scrub starts Feb 23 02:59:15 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.3 scrub ok Feb 23 02:59:15 localhost systemd[1]: libpod-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope: Deactivated successfully. Feb 23 02:59:15 localhost systemd[1]: libpod-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope: Consumed 2.269s CPU time. 
Feb 23 02:59:15 localhost podman[58179]: 2026-02-23 07:59:15.954650392 +0000 UTC m=+3.118914976 container died c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 23 02:59:16 localhost ceph-osd[32575]: osd.5 pg_epoch: 61 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=9.284608841s) [2,4,3] r=-1 lpr=61 pi=[45,61)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1219.043090820s@ mbc={}] start_peering_interval up [4,5,0] -> [2,4,3], acting [4,5,0] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:16 localhost ceph-osd[32575]: osd.5 pg_epoch: 61 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=45/46 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=9.284519196s) [2,4,3] r=-1 lpr=61 pi=[45,61)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1219.043090820s@ mbc={}] state: transitioning to Stray Feb 23 02:59:16 localhost ceph-osd[31633]: osd.2 pg_epoch: 61 pg[7.a( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61) [2,4,3] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Feb 23 02:59:16 localhost podman[58471]: 2026-02-23 07:59:16.062501905 +0000 UTC m=+0.092565356 container cleanup 
c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, release=1766032510, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, container_name=create_haproxy_wrapper, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 23 02:59:16 localhost systemd[1]: libpod-conmon-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36.scope: Deactivated successfully. Feb 23 02:59:16 localhost python3[57916]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Feb 23 02:59:16 localhost systemd[1]: var-lib-containers-storage-overlay-eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54-merged.mount: Deactivated successfully. Feb 23 02:59:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1ab6b8b4cb9390f2f764277e6ce50695b194b3a53a4862001a92a1a68dbfd36-userdata-shm.mount: Deactivated successfully. 
Feb 23 02:59:16 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 6.1 scrub starts Feb 23 02:59:16 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 6.1 scrub ok Feb 23 02:59:16 localhost python3[58527]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:17 localhost ceph-osd[31633]: osd.2 pg_epoch: 62 pg[7.a( v 36'39 (0'0,36'39] local-lis/les=61/62 n=1 ec=41/34 lis/c=45/45 les/c/f=46/46/0 sis=61) [2,4,3] r=0 lpr=61 pi=[45,61)/1 crt=36'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:59:18 localhost python3[58648]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005626463 step=2 update_config_hash_only=False Feb 23 02:59:18 localhost python3[58664]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 02:59:19 localhost python3[58680]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 23 02:59:23 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1c deep-scrub starts 
Feb 23 02:59:23 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 4.1c deep-scrub ok Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 64 crush map has features 432629239337189376, adjusting msgr requires for clients Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 64 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 64 crush map has features 3314933000854323200, adjusting msgr requires for osds Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 pg_epoch: 64 pg[4.1( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.048739433s) [2,0,4] r=0 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 active pruub 1237.859008789s@ mbc={}] start_peering_interval up [2,0,1] -> [2,0,4], acting [2,0,1] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 pg_epoch: 64 pg[4.1( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.048739433s) [2,0,4] r=0 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 unknown pruub 1237.859008789s@ mbc={}] state: transitioning to Primary Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 pg_epoch: 64 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=49/50 n=1 ec=41/34 lis/c=49/49 les/c/f=50/50/0 sis=64 pruub=10.373780251s) [0,1,2] r=2 lpr=64 pi=[49,64)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1233.184448242s@ mbc={}] start_peering_interval up [1,3,2] -> [0,1,2], acting [1,3,2] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:24 localhost ceph-osd[31633]: osd.2 pg_epoch: 64 pg[7.c( v 36'39 (0'0,36'39] local-lis/les=49/50 n=1 ec=41/34 lis/c=49/49 les/c/f=50/50/0 sis=64 pruub=10.373656273s) [0,1,2] r=2 lpr=64 pi=[49,64)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1233.184448242s@ mbc={}] state: 
transitioning to Stray Feb 23 02:59:24 localhost ceph-osd[32575]: osd.5 64 crush map has features 432629239337189376, adjusting msgr requires for clients Feb 23 02:59:24 localhost ceph-osd[32575]: osd.5 64 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons Feb 23 02:59:24 localhost ceph-osd[32575]: osd.5 64 crush map has features 3314933000854323200, adjusting msgr requires for osds Feb 23 02:59:24 localhost ceph-osd[32575]: osd.5 pg_epoch: 64 pg[4.1b( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.022865295s) [3,4,5] r=2 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 active pruub 1232.983154297s@ mbc={}] start_peering_interval up [3,1,5] -> [3,4,5], acting [3,1,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:24 localhost ceph-osd[32575]: osd.5 pg_epoch: 64 pg[4.1b( empty local-lis/les=43/44 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64 pruub=15.022788048s) [3,4,5] r=2 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1232.983154297s@ mbc={}] state: transitioning to Stray Feb 23 02:59:24 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.d scrub starts Feb 23 02:59:24 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.d scrub ok Feb 23 02:59:25 localhost ceph-osd[31633]: osd.2 pg_epoch: 65 pg[4.1( empty local-lis/les=64/65 n=0 ec=39/24 lis/c=43/43 les/c/f=44/44/0 sis=64) [2,0,4] r=0 lpr=64 pi=[43,64)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Feb 23 02:59:26 localhost ceph-osd[31633]: osd.2 pg_epoch: 66 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=51/52 n=2 ec=41/34 lis/c=51/51 les/c/f=52/52/0 sis=66 pruub=8.429811478s) [3,4,5] r=-1 lpr=66 pi=[51,66)/1 crt=36'39 mlcod 0'0 active pruub 1233.304321289s@ mbc={}] start_peering_interval up [2,4,0] -> [3,4,5], acting [2,4,0] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 
3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:26 localhost ceph-osd[31633]: osd.2 pg_epoch: 66 pg[7.d( v 36'39 (0'0,36'39] local-lis/les=51/52 n=2 ec=41/34 lis/c=51/51 les/c/f=52/52/0 sis=66 pruub=8.429599762s) [3,4,5] r=-1 lpr=66 pi=[51,66)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1233.304321289s@ mbc={}] state: transitioning to Stray Feb 23 02:59:27 localhost sshd[58681]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:59:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:59:27 localhost podman[58682]: 2026-02-23 07:59:27.92364386 +0000 UTC m=+0.090258253 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-qdrouterd-container, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=) Feb 23 02:59:28 localhost podman[58682]: 2026-02-23 07:59:28.141993688 +0000 UTC m=+0.308608051 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Feb 23 02:59:28 localhost systemd[1]: 
f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 02:59:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 66 pg[7.d( empty local-lis/les=0/0 n=0 ec=41/34 lis/c=51/51 les/c/f=52/52/0 sis=66) [3,4,5] r=2 lpr=66 pi=[51,66)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Feb 23 02:59:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 68 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=53/54 n=1 ec=41/34 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=8.680847168s) [1,5,3] r=1 lpr=68 pi=[53,68)/1 luod=0'0 crt=36'39 lcod 0'0 mlcod 0'0 active pruub 1230.706176758s@ mbc={}] start_peering_interval up [1,0,5] -> [1,5,3], acting [1,0,5] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:28 localhost ceph-osd[32575]: osd.5 pg_epoch: 68 pg[7.e( v 36'39 (0'0,36'39] local-lis/les=53/54 n=1 ec=41/34 lis/c=53/53 les/c/f=54/54/0 sis=68 pruub=8.680780411s) [1,5,3] r=1 lpr=68 pi=[53,68)/1 crt=36'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1230.706176758s@ mbc={}] state: transitioning to Stray Feb 23 02:59:28 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.5 deep-scrub starts Feb 23 02:59:28 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.5 deep-scrub ok Feb 23 02:59:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 69 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=55/56 n=1 ec=41/34 lis/c=55/55 les/c/f=56/56/0 sis=69 pruub=15.563361168s) [1,5,3] r=1 lpr=69 pi=[55,69)/1 luod=0'0 crt=36'39 mlcod 0'0 active pruub 1238.626098633s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Feb 23 02:59:29 localhost ceph-osd[32575]: osd.5 pg_epoch: 69 pg[7.f( v 36'39 (0'0,36'39] local-lis/les=55/56 n=1 ec=41/34 lis/c=55/55 les/c/f=56/56/0 sis=69 pruub=15.563264847s) [1,5,3] r=1 lpr=69 
pi=[55,69)/1 crt=36'39 mlcod 0'0 unknown NOTIFY pruub 1238.626098633s@ mbc={}] state: transitioning to Stray Feb 23 02:59:30 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.a scrub starts Feb 23 02:59:30 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 7.a scrub ok Feb 23 02:59:33 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.8 scrub starts Feb 23 02:59:33 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.8 scrub ok Feb 23 02:59:37 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.e scrub starts Feb 23 02:59:37 localhost ceph-osd[31633]: log_channel(cluster) log [DBG] : 5.e scrub ok Feb 23 02:59:45 localhost sshd[58711]: main: sshd: ssh-rsa algorithm is disabled Feb 23 02:59:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 02:59:58 localhost systemd[1]: tmp-crun.rILy3a.mount: Deactivated successfully. Feb 23 02:59:58 localhost podman[58789]: 2026-02-23 07:59:58.914831452 +0000 UTC m=+0.091897135 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 02:59:59 localhost podman[58789]: 2026-02-23 07:59:59.113222855 +0000 UTC m=+0.290288518 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, 
release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public) Feb 23 02:59:59 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:00:24 localhost sshd[58819]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:00:27 localhost sshd[58821]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:00:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:00:29 localhost systemd[1]: tmp-crun.z28wnv.mount: Deactivated successfully. Feb 23 03:00:29 localhost podman[58823]: 2026-02-23 08:00:29.922586569 +0000 UTC m=+0.093597371 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:00:30 localhost podman[58823]: 2026-02-23 08:00:30.113389302 +0000 UTC m=+0.284400104 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, 
batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:00:30 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:00:58 localhost podman[58953]: 2026-02-23 08:00:58.422364951 +0000 UTC m=+0.106662709 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main) Feb 23 03:00:58 localhost podman[58953]: 2026-02-23 08:00:58.528400973 +0000 UTC m=+0.212698731 container exec_died 
fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.buildah.version=1.42.2, RELEASE=main, ceph=True, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container) Feb 23 03:01:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:01:00 localhost podman[59095]: 2026-02-23 08:01:00.920347129 +0000 UTC m=+0.092793200 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:01:01 localhost podman[59095]: 2026-02-23 08:01:01.112687634 +0000 UTC m=+0.285133665 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:01:01 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:01:08 localhost sshd[59134]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:01:21 localhost sshd[59136]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:01:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:01:31 localhost podman[59138]: 2026-02-23 08:01:31.914124359 +0000 UTC m=+0.084158226 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 23 03:01:32 localhost podman[59138]: 2026-02-23 08:01:32.105110502 +0000 UTC m=+0.275144769 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:01:32 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:01:50 localhost sshd[59167]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:02:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:02:02 localhost systemd[1]: tmp-crun.Nz7jYf.mount: Deactivated successfully. 
Feb 23 03:02:02 localhost podman[59246]: 2026-02-23 08:02:02.919152813 +0000 UTC m=+0.093362617 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com) Feb 23 03:02:03 localhost podman[59246]: 2026-02-23 08:02:03.163258592 +0000 UTC m=+0.337468366 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:02:03 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:02:14 localhost sshd[59276]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:02:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:02:33 localhost podman[59278]: 2026-02-23 08:02:33.904899982 +0000 UTC m=+0.080425265 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 23 03:02:34 localhost podman[59278]: 2026-02-23 08:02:34.095455404 +0000 UTC m=+0.270980677 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z) Feb 23 03:02:34 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:02:34 localhost sshd[59307]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:03:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:03:04 localhost systemd[1]: tmp-crun.3iAM2K.mount: Deactivated successfully. 
Feb 23 03:03:04 localhost podman[59386]: 2026-02-23 08:03:04.908657731 +0000 UTC m=+0.080516637 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:03:05 localhost podman[59386]: 2026-02-23 08:03:05.097354915 +0000 UTC m=+0.269213811 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:03:05 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:03:09 localhost sshd[59415]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:03:17 localhost sshd[59417]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:03:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=49960 SEQ=0 ACK=3124770151 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:03:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:03:35 localhost systemd[1]: tmp-crun.yanLUg.mount: Deactivated successfully. Feb 23 03:03:35 localhost podman[59419]: 2026-02-23 08:03:35.909360466 +0000 UTC m=+0.084747059 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, release=1766032510, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true) Feb 23 03:03:36 localhost podman[59419]: 2026-02-23 08:03:36.104061848 +0000 UTC m=+0.279448461 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO 
Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:03:36 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:03:55 localhost sshd[59448]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:03:56 localhost python3[59497]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:03:57 localhost python3[59542]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833836.58139-99598-278474677941260/source _original_basename=tmp9owi_7bf follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:03:58 localhost python3[59572]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:03:59 localhost sshd[59623]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:04:00 localhost ansible-async_wrapper.py[59746]: Invoked with 839121906300 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833839.8767023-99778-228112044019917/AnsiballZ_command.py _ Feb 23 03:04:00 localhost ansible-async_wrapper.py[59749]: Starting module and watcher Feb 23 03:04:00 localhost ansible-async_wrapper.py[59749]: Start watching 59750 (3600) Feb 23 03:04:00 localhost ansible-async_wrapper.py[59750]: Start module (59750) Feb 23 03:04:00 localhost ansible-async_wrapper.py[59746]: Return async_wrapper task started. 
Feb 23 03:04:00 localhost python3[59770]: ansible-ansible.legacy.async_status Invoked with jid=839121906300.59746 mode=status _async_dir=/tmp/.ansible_async Feb 23 03:04:03 localhost puppet-user[59754]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Feb 23 03:04:03 localhost puppet-user[59754]: (file: /etc/puppet/hiera.yaml) Feb 23 03:04:03 localhost puppet-user[59754]: Warning: Undefined variable '::deploy_config_name'; Feb 23 03:04:03 localhost puppet-user[59754]: (file & line not available) Feb 23 03:04:04 localhost puppet-user[59754]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Feb 23 03:04:04 localhost puppet-user[59754]: (file & line not available) Feb 23 03:04:04 localhost puppet-user[59754]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Feb 23 03:04:04 localhost puppet-user[59754]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 23 03:04:04 localhost puppet-user[59754]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.13 seconds Feb 23 03:04:04 localhost puppet-user[59754]: Notice: Applied catalog in 0.05 seconds Feb 23 03:04:04 localhost puppet-user[59754]: Application: Feb 23 03:04:04 localhost puppet-user[59754]: Initial environment: production Feb 23 03:04:04 localhost puppet-user[59754]: Converged environment: production Feb 23 03:04:04 localhost puppet-user[59754]: Run mode: user Feb 23 03:04:04 localhost puppet-user[59754]: Changes: Feb 23 03:04:04 localhost puppet-user[59754]: Events: Feb 23 03:04:04 localhost puppet-user[59754]: Resources: Feb 23 03:04:04 localhost puppet-user[59754]: Total: 10 Feb 23 03:04:04 localhost puppet-user[59754]: Time: Feb 23 03:04:04 localhost puppet-user[59754]: Schedule: 0.00 Feb 23 03:04:04 localhost puppet-user[59754]: File: 0.00 Feb 23 03:04:04 localhost puppet-user[59754]: Augeas: 0.01 Feb 23 03:04:04 localhost puppet-user[59754]: Exec: 0.01 Feb 23 03:04:04 localhost puppet-user[59754]: Transaction evaluation: 0.04 Feb 23 03:04:04 localhost puppet-user[59754]: Catalog application: 0.05 Feb 23 03:04:04 localhost puppet-user[59754]: Config retrieval: 0.17 Feb 23 03:04:04 localhost puppet-user[59754]: Last run: 1771833844 Feb 23 03:04:04 localhost puppet-user[59754]: Filebucket: 0.00 Feb 23 03:04:04 localhost puppet-user[59754]: Total: 0.06 Feb 23 03:04:04 localhost puppet-user[59754]: Version: Feb 23 03:04:04 localhost puppet-user[59754]: Config: 1771833843 Feb 23 03:04:04 localhost puppet-user[59754]: Puppet: 7.10.0 Feb 23 03:04:04 localhost ansible-async_wrapper.py[59750]: Module complete (59750) Feb 23 03:04:05 localhost ansible-async_wrapper.py[59749]: Done in kid B. Feb 23 03:04:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:04:06 localhost podman[59958]: 2026-02-23 08:04:06.921104596 +0000 UTC m=+0.090164138 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, distribution-scope=public, vcs-type=git) Feb 23 03:04:07 localhost podman[59958]: 2026-02-23 08:04:07.157025711 +0000 UTC m=+0.326085243 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:04:07 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:04:11 localhost python3[60002]: ansible-ansible.legacy.async_status Invoked with jid=839121906300.59746 mode=status _async_dir=/tmp/.ansible_async Feb 23 03:04:11 localhost python3[60018]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 23 03:04:12 localhost python3[60034]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:12 localhost python3[60084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:13 localhost python3[60102]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpfm4zvh7f recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 23 03:04:13 localhost python3[60132]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 23 03:04:14 localhost python3[60236]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Feb 23 03:04:15 localhost python3[60255]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:16 localhost python3[60287]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:17 localhost python3[60337]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:17 localhost python3[60355]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:18 localhost python3[60417]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:18 localhost python3[60435]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:19 localhost python3[60497]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:19 localhost python3[60515]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:20 localhost python3[60577]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:20 localhost python3[60595]: ansible-ansible.legacy.file Invoked with 
mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:20 localhost python3[60625]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:04:20 localhost systemd[1]: Reloading. Feb 23 03:04:21 localhost systemd-sysv-generator[60653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:04:21 localhost systemd-rc-local-generator[60648]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:04:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 03:04:21 localhost python3[60710]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:22 localhost python3[60728]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:22 localhost python3[60790]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:04:22 localhost python3[60808]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:23 localhost python3[60838]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:04:23 localhost systemd[1]: Reloading. Feb 23 03:04:23 localhost systemd-rc-local-generator[60859]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 03:04:23 localhost systemd-sysv-generator[60863]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:04:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:04:23 localhost systemd[1]: Starting Create netns directory... Feb 23 03:04:23 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 23 03:04:23 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 03:04:23 localhost systemd[1]: Finished Create netns directory. Feb 23 03:04:24 localhost python3[60897]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Feb 23 03:04:26 localhost python3[60956]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Feb 23 03:04:27 localhost podman[61111]: 2026-02-23 08:04:27.183177363 +0000 UTC m=+0.075249914 container create 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, 
io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, container_name=nova_virtlogd_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 03:04:27 localhost podman[61125]: 2026-02-23 08:04:27.188351979 +0000 UTC m=+0.070920175 container create 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:04:27 localhost podman[61140]: 2026-02-23 08:04:27.214351222 +0000 UTC m=+0.075489731 container create 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:09Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vcs-type=git, io.buildah.version=1.41.5) Feb 23 03:04:27 localhost systemd[1]: Started libpod-conmon-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope. Feb 23 03:04:27 localhost systemd[1]: Started libcrun container. Feb 23 03:04:27 localhost podman[61111]: 2026-02-23 08:04:27.140020599 +0000 UTC m=+0.032093160 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46/merged/scripts supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost podman[61125]: 2026-02-23 08:04:27.145677221 +0000 UTC m=+0.028245427 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 23 03:04:27 localhost podman[61138]: 2026-02-23 08:04:27.249449618 +0000 UTC m=+0.112535449 container create 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step3, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13) Feb 23 03:04:27 localhost systemd[1]: Started libpod-conmon-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b.scope. Feb 23 03:04:27 localhost systemd[1]: Started libpod-conmon-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope. Feb 23 03:04:27 localhost systemd[1]: Started libpod-conmon-6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537.scope. 
Feb 23 03:04:27 localhost podman[61143]: 2026-02-23 08:04:27.175182456 +0000 UTC m=+0.032614666 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 03:04:27 localhost podman[61140]: 2026-02-23 08:04:27.178180462 +0000 UTC m=+0.039319001 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 23 03:04:27 localhost podman[61138]: 2026-02-23 08:04:27.178546484 +0000 UTC m=+0.041632315 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 23 03:04:27 localhost systemd[1]: Started libcrun container. Feb 23 03:04:27 localhost systemd[1]: Started libcrun container. Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/cache/libvirt 
supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost systemd[1]: Started libcrun container. Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e3fe691531a0d3ed4e0bd844aee95e09028b37c75ae9985cc1386696cb9ad2a/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost podman[61111]: 2026-02-23 08:04:27.293906584 +0000 UTC m=+0.185979145 container init 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2026-01-12T23:31:49Z, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack 
TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z) Feb 23 03:04:27 localhost podman[61111]: 2026-02-23 08:04:27.299662169 +0000 UTC m=+0.191734730 container start 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtlogd_wrapper, config_id=tripleo_step3, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, 
distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, vcs-type=git) Feb 23 03:04:27 localhost python3[60956]: ansible-tripleo_container_manage 
PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:04:27 localhost podman[61125]: 2026-02-23 08:04:27.321376154 +0000 UTC m=+0.203944380 container init 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, container_name=collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:04:27 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. 
Feb 23 03:04:27 localhost podman[61143]: 2026-02-23 08:04:27.344096244 +0000 UTC m=+0.201528454 container create c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, version=17.1.13, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=nova_statedir_owner, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3) Feb 23 03:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:04:27 localhost podman[61140]: 2026-02-23 08:04:27.34773485 +0000 UTC m=+0.208873369 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.13, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog) Feb 23 03:04:27 localhost podman[61140]: 2026-02-23 08:04:27.357641367 +0000 UTC m=+0.218779886 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog) Feb 23 03:04:27 localhost systemd[1]: Created slice User Slice of UID 0. Feb 23 03:04:27 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:04:27 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Feb 23 03:04:27 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=8e5028e38f7077561ef1e3e50ec174a3 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Feb 23 03:04:27 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 23 03:04:27 localhost systemd[1]: Starting User Manager for UID 0... Feb 23 03:04:27 localhost podman[61138]: 2026-02-23 08:04:27.394939714 +0000 UTC m=+0.258025545 container init 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=ceilometer_init_log, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Feb 23 03:04:27 localhost podman[61125]: 2026-02-23 08:04:27.405265735 +0000 UTC m=+0.287833921 container start 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=) Feb 23 03:04:27 localhost systemd[1]: libpod-6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537.scope: Deactivated successfully. 
Feb 23 03:04:27 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Feb 23 03:04:27 localhost systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. 
Feb 23 03:04:27 localhost podman[61236]: 2026-02-23 08:04:27.440408612 +0000 UTC m=+0.060881393 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, version=17.1.13, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:04:27 localhost systemd[1]: Started libpod-conmon-c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6.scope. Feb 23 03:04:27 localhost podman[61138]: 2026-02-23 08:04:27.509805068 +0000 UTC m=+0.372890919 container start 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ceilometer_init_log, tcib_managed=true, vendor=Red Hat, Inc.) Feb 23 03:04:27 localhost systemd[1]: Started libcrun container. Feb 23 03:04:27 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Feb 23 03:04:27 localhost podman[61256]: 2026-02-23 08:04:27.514795678 +0000 UTC m=+0.096026451 container died 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, distribution-scope=public, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, container_name=ceilometer_init_log, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:27 localhost systemd[61241]: Queued start job for default target Main User Target. Feb 23 03:04:27 localhost podman[61143]: 2026-02-23 08:04:27.523127124 +0000 UTC m=+0.380559324 container init c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, config_id=tripleo_step3, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_statedir_owner) Feb 23 03:04:27 localhost systemd[61241]: Created slice User Application Slice. Feb 23 03:04:27 localhost systemd[61241]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 23 03:04:27 localhost systemd[61241]: Started Daily Cleanup of User's Temporary Directories. Feb 23 03:04:27 localhost systemd[61241]: Reached target Paths. Feb 23 03:04:27 localhost systemd[61241]: Reached target Timers. Feb 23 03:04:27 localhost systemd[61241]: Starting D-Bus User Message Bus Socket... Feb 23 03:04:27 localhost systemd[61241]: Starting Create User's Volatile Files and Directories... 
Feb 23 03:04:27 localhost podman[61143]: 2026-02-23 08:04:27.530247403 +0000 UTC m=+0.387679603 container start c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 23 03:04:27 localhost podman[61143]: 2026-02-23 08:04:27.530416218 +0000 UTC m=+0.387848408 container attach c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, distribution-scope=public, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 23 03:04:27 localhost systemd[61241]: Finished Create User's Volatile Files and Directories. Feb 23 03:04:27 localhost podman[61256]: 2026-02-23 08:04:27.540927476 +0000 UTC m=+0.122158209 container cleanup 6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step3, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, container_name=ceilometer_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:04:27 localhost systemd[61241]: Listening on D-Bus User Message Bus Socket. Feb 23 03:04:27 localhost systemd[61241]: Reached target Sockets. Feb 23 03:04:27 localhost systemd[61241]: Reached target Basic System. Feb 23 03:04:27 localhost systemd[61241]: Reached target Main User Target. Feb 23 03:04:27 localhost systemd[61241]: Startup finished in 131ms. Feb 23 03:04:27 localhost systemd[1]: Started User Manager for UID 0. Feb 23 03:04:27 localhost systemd[1]: Started Session c1 of User root. Feb 23 03:04:27 localhost systemd[1]: Started Session c2 of User root. Feb 23 03:04:27 localhost systemd[1]: libpod-conmon-6ae4af24cc7f8254adc5ccdc0b5d346c50b5973ee040979ebfa9ae1599478537.scope: Deactivated successfully. Feb 23 03:04:27 localhost systemd[1]: libpod-c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6.scope: Deactivated successfully. 
Feb 23 03:04:27 localhost podman[61143]: 2026-02-23 08:04:27.576465466 +0000 UTC m=+0.433897686 container died c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, 
build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:04:27 localhost podman[61278]: 2026-02-23 08:04:27.622099499 +0000 UTC m=+0.173157514 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, release=1766032510, io.openshift.expose-services=, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:09Z, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Feb 23 03:04:27 localhost systemd[1]: libpod-conmon-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. Feb 23 03:04:27 localhost systemd[1]: session-c1.scope: Deactivated successfully. Feb 23 03:04:27 localhost systemd[1]: session-c2.scope: Deactivated successfully. 
Feb 23 03:04:27 localhost podman[61225]: 2026-02-23 08:04:27.594186344 +0000 UTC m=+0.236925550 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64) Feb 23 03:04:27 localhost podman[61225]: 2026-02-23 08:04:27.677154434 +0000 UTC m=+0.319893620 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:04:27 localhost podman[61225]: unhealthy Feb 23 03:04:27 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Main process exited, code=exited, 
status=1/FAILURE Feb 23 03:04:27 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed with result 'exit-code'. Feb 23 03:04:27 localhost podman[61363]: 2026-02-23 08:04:27.761150198 +0000 UTC m=+0.169936850 container cleanup c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:04:27 localhost systemd[1]: libpod-conmon-c65bb836e8986e21560c647871dbfc4241d11dbb3f691c6e6e88dd3439c9e7c6.scope: Deactivated successfully. Feb 23 03:04:27 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh 
/container-config-scripts/nova_statedir_ownership.py Feb 23 03:04:27 localhost podman[61473]: 2026-02-23 08:04:27.965315365 +0000 UTC m=+0.076738451 container create 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:04:28 localhost systemd[1]: Started libpod-conmon-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74.scope. Feb 23 03:04:28 localhost systemd[1]: Started libcrun container. 
Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost podman[61473]: 2026-02-23 08:04:27.93428612 +0000 UTC m=+0.045709246 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:28 localhost podman[61473]: 2026-02-23 08:04:28.034134573 +0000 UTC m=+0.145557739 container init 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.) Feb 23 03:04:28 localhost podman[61473]: 2026-02-23 08:04:28.040495736 +0000 UTC m=+0.151918822 container start 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, distribution-scope=public) Feb 23 03:04:28 localhost systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully. Feb 23 03:04:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4-userdata-shm.mount: Deactivated successfully. Feb 23 03:04:28 localhost podman[61536]: 2026-02-23 08:04:28.237567637 +0000 UTC m=+0.087666583 container create c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_virtsecretd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:04:28 localhost podman[61536]: 2026-02-23 08:04:28.191152968 +0000 UTC m=+0.041251934 image pull 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:28 localhost systemd[1]: Started libpod-conmon-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f.scope. Feb 23 03:04:28 localhost systemd[1]: Started libcrun container. Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost podman[61536]: 2026-02-23 08:04:28.338427991 
+0000 UTC m=+0.188526927 container init c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, build-date=2026-01-12T23:31:49Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 23 03:04:28 localhost podman[61536]: 2026-02-23 08:04:28.350409205 +0000 UTC m=+0.200508141 container start c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack 
TripleO Team, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtsecretd, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}) Feb 23 03:04:28 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume 
/etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:28 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:04:28 localhost systemd[1]: Started Session c3 of User root. Feb 23 03:04:28 localhost systemd[1]: session-c3.scope: Deactivated successfully. Feb 23 03:04:28 localhost podman[61676]: 2026-02-23 08:04:28.83016426 +0000 UTC m=+0.073164787 container create 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, batch=17.1_20260112.1, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:34:43Z, vcs-type=git, config_id=tripleo_step3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public) Feb 23 03:04:28 localhost podman[61677]: 2026-02-23 08:04:28.858432397 +0000 UTC m=+0.097323942 container create 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container) Feb 23 03:04:28 localhost systemd[1]: Started libpod-conmon-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope. Feb 23 03:04:28 localhost systemd[1]: Started libpod-conmon-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope. Feb 23 03:04:28 localhost podman[61676]: 2026-02-23 08:04:28.794078063 +0000 UTC m=+0.037078550 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 23 03:04:28 localhost podman[61677]: 2026-02-23 08:04:28.795723196 +0000 UTC m=+0.034614791 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:28 localhost systemd[1]: Started libcrun container. Feb 23 03:04:28 localhost systemd[1]: Started libcrun container. 
Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067/merged/etc/target supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:04:28 localhost podman[61676]: 2026-02-23 08:04:28.934148415 +0000 UTC m=+0.177148982 container init 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 23 03:04:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:04:28 localhost podman[61677]: 2026-02-23 08:04:28.966627067 +0000 UTC m=+0.205518612 container init 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3) Feb 23 03:04:28 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. 
Feb 23 03:04:28 localhost podman[61676]: 2026-02-23 08:04:28.974191819 +0000 UTC m=+0.217192336 container start 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:04:28 localhost systemd[1]: Started Session c4 of User root. Feb 23 03:04:28 localhost podman[61677]: 2026-02-23 08:04:28.979774778 +0000 UTC m=+0.218666313 container start 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com) Feb 23 03:04:28 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=45772c82d00b8348e0440509154d74a9 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Feb 23 03:04:28 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume 
/run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:29 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:04:29 localhost systemd[1]: Started Session c5 of User root. Feb 23 03:04:29 localhost systemd[1]: session-c4.scope: Deactivated successfully. 
Feb 23 03:04:29 localhost podman[61716]: 2026-02-23 08:04:29.066316663 +0000 UTC m=+0.088535379 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:04:29 localhost kernel: Loading iSCSI transport class v2.0-870. Feb 23 03:04:29 localhost systemd[1]: session-c5.scope: Deactivated successfully. Feb 23 03:04:29 localhost podman[61716]: 2026-02-23 08:04:29.153773499 +0000 UTC m=+0.175992215 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1) Feb 23 03:04:29 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:04:29 localhost podman[61859]: 2026-02-23 08:04:29.566829715 +0000 UTC m=+0.087585260 container create 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_virtstoraged, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container) Feb 23 03:04:29 localhost podman[61859]: 2026-02-23 08:04:29.517855175 +0000 UTC m=+0.038610730 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:29 localhost systemd[1]: Started libpod-conmon-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843.scope. Feb 23 03:04:29 localhost systemd[1]: Started libcrun container. 
Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:29 localhost podman[61859]: 2026-02-23 08:04:29.649212766 +0000 UTC m=+0.169968311 container init 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 23 03:04:29 localhost podman[61859]: 2026-02-23 08:04:29.658503295 +0000 UTC m=+0.179258840 container start 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtstoraged, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, release=1766032510, io.openshift.expose-services=) Feb 23 03:04:29 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume 
/var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:29 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:04:29 localhost systemd[1]: Started Session c6 of User root. Feb 23 03:04:29 localhost systemd[1]: session-c6.scope: Deactivated successfully. Feb 23 03:04:30 localhost podman[61963]: 2026-02-23 08:04:30.126584146 +0000 UTC m=+0.083417057 container create ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': 
['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z) Feb 23 03:04:30 localhost systemd[1]: Started libpod-conmon-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope. Feb 23 03:04:30 localhost podman[61963]: 2026-02-23 08:04:30.082637176 +0000 UTC m=+0.039470097 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:30 localhost systemd[1]: Started libcrun container. Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs 
filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost podman[61963]: 2026-02-23 08:04:30.197692366 +0000 UTC m=+0.154525277 container init ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, container_name=nova_virtqemud, url=https://www.redhat.com, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': 
['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, release=1766032510, io.openshift.expose-services=, tcib_managed=true) Feb 23 03:04:30 localhost podman[61963]: 2026-02-23 08:04:30.2099671 +0000 UTC m=+0.166800021 container start 
ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 23 03:04:30 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro 
--volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:30 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:04:30 localhost systemd[1]: Started Session c7 of User root. Feb 23 03:04:30 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Feb 23 03:04:30 localhost podman[62068]: 2026-02-23 08:04:30.66399139 +0000 UTC m=+0.082519438 container create 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, container_name=nova_virtproxyd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 03:04:30 localhost systemd[1]: Started libpod-conmon-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a.scope. Feb 23 03:04:30 localhost podman[62068]: 2026-02-23 08:04:30.618427558 +0000 UTC m=+0.036955586 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:30 localhost systemd[1]: Started libcrun container. 
Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:30 localhost podman[62068]: 2026-02-23 08:04:30.741161994 +0000 UTC m=+0.159689992 container init 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, tcib_managed=true, 
com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 03:04:30 localhost podman[62068]: 2026-02-23 08:04:30.750954709 +0000 UTC m=+0.169482717 container start 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, 
vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtproxyd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13) Feb 23 03:04:30 localhost python3[60956]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', 
'/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume 
/var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Feb 23 03:04:30 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:04:30 localhost systemd[1]: Started Session c8 of User root. Feb 23 03:04:30 localhost systemd[1]: session-c8.scope: Deactivated successfully. Feb 23 03:04:31 localhost python3[62150]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:31 localhost python3[62166]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:31 localhost python3[62182]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:32 localhost python3[62198]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:32 localhost python3[62214]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:32 localhost python3[62230]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:32 localhost python3[62246]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:33 localhost python3[62262]: ansible-file Invoked 
with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:33 localhost python3[62278]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:33 localhost python3[62295]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:33 localhost python3[62311]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:34 localhost python3[62327]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:34 localhost python3[62343]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:34 localhost python3[62359]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True 
checksum_algorithm=sha1 Feb 23 03:04:34 localhost python3[62375]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:35 localhost python3[62391]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:35 localhost python3[62407]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:35 localhost python3[62423]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:04:36 localhost python3[62484]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:37 localhost python3[62513]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:37 
localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:04:37 localhost systemd[1]: tmp-crun.RKrzlX.mount: Deactivated successfully. Feb 23 03:04:37 localhost podman[62543]: 2026-02-23 08:04:37.515075018 +0000 UTC m=+0.107049014 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.) Feb 23 03:04:37 localhost python3[62542]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:37 localhost podman[62543]: 2026-02-23 08:04:37.717175129 +0000 UTC m=+0.309149115 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, container_name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-qdrouterd)
Feb 23 03:04:37 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 03:04:38 localhost python3[62601]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:38 localhost python3[62630]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:39 localhost python3[62659]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:39 localhost sshd[62660]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:04:39 localhost python3[62690]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:40 localhost python3[62719]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:40 localhost python3[62748]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771833875.9017687-101034-25393036279417/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:41 localhost systemd[1]: Stopping User Manager for UID 0...
Feb 23 03:04:41 localhost systemd[61241]: Activating special unit Exit the Session...
Feb 23 03:04:41 localhost systemd[61241]: Stopped target Main User Target.
Feb 23 03:04:41 localhost systemd[61241]: Stopped target Basic System.
Feb 23 03:04:41 localhost systemd[61241]: Stopped target Paths.
Feb 23 03:04:41 localhost systemd[61241]: Stopped target Sockets.
Feb 23 03:04:41 localhost systemd[61241]: Stopped target Timers.
Feb 23 03:04:41 localhost systemd[61241]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 23 03:04:41 localhost systemd[61241]: Closed D-Bus User Message Bus Socket.
Feb 23 03:04:41 localhost systemd[61241]: Stopped Create User's Volatile Files and Directories.
Feb 23 03:04:41 localhost systemd[61241]: Removed slice User Application Slice.
Feb 23 03:04:41 localhost systemd[61241]: Reached target Shutdown.
Feb 23 03:04:41 localhost systemd[61241]: Finished Exit the Session.
Feb 23 03:04:41 localhost systemd[61241]: Reached target Exit the Session.
Feb 23 03:04:41 localhost systemd[1]: user@0.service: Deactivated successfully.
Feb 23 03:04:41 localhost systemd[1]: Stopped User Manager for UID 0.
Feb 23 03:04:41 localhost python3[62764]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 23 03:04:41 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 23 03:04:41 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 23 03:04:41 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 23 03:04:41 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 23 03:04:41 localhost systemd[1]: Removed slice User Slice of UID 0.
Feb 23 03:04:41 localhost systemd[1]: Reloading.
Feb 23 03:04:41 localhost sshd[62767]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:04:41 localhost systemd-rc-local-generator[62789]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:41 localhost systemd-sysv-generator[62795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:41 localhost python3[62819]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:42 localhost systemd[1]: Reloading.
Feb 23 03:04:42 localhost systemd-rc-local-generator[62848]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:42 localhost systemd-sysv-generator[62852]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:42 localhost systemd[1]: Starting collectd container...
Feb 23 03:04:42 localhost systemd[1]: Started collectd container.
Feb 23 03:04:43 localhost python3[62885]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:43 localhost systemd[1]: Reloading.
Feb 23 03:04:43 localhost systemd-rc-local-generator[62912]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:43 localhost systemd-sysv-generator[62917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:43 localhost systemd[1]: Starting iscsid container...
Feb 23 03:04:43 localhost systemd[1]: Started iscsid container.
Feb 23 03:04:44 localhost python3[62951]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:44 localhost systemd[1]: Reloading.
Feb 23 03:04:44 localhost systemd-sysv-generator[62983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:44 localhost systemd-rc-local-generator[62976]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:44 localhost systemd[1]: Starting nova_virtlogd_wrapper container...
Feb 23 03:04:44 localhost systemd[1]: Started nova_virtlogd_wrapper container.
Feb 23 03:04:45 localhost python3[63017]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:45 localhost systemd[1]: Reloading.
Feb 23 03:04:45 localhost systemd-sysv-generator[63048]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:45 localhost systemd-rc-local-generator[63043]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:45 localhost systemd[1]: Starting nova_virtnodedevd container...
Feb 23 03:04:45 localhost tripleo-start-podman-container[63057]: Creating additional drop-in dependency for "nova_virtnodedevd" (930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2)
Feb 23 03:04:45 localhost systemd[1]: Reloading.
Feb 23 03:04:45 localhost systemd-sysv-generator[63119]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:45 localhost systemd-rc-local-generator[63115]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:46 localhost systemd[1]: Started nova_virtnodedevd container.
Feb 23 03:04:46 localhost python3[63140]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:46 localhost systemd[1]: Reloading.
Feb 23 03:04:46 localhost systemd-rc-local-generator[63166]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:46 localhost systemd-sysv-generator[63171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:47 localhost systemd[1]: Starting nova_virtproxyd container...
Feb 23 03:04:47 localhost tripleo-start-podman-container[63179]: Creating additional drop-in dependency for "nova_virtproxyd" (33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a)
Feb 23 03:04:47 localhost systemd[1]: Reloading.
Feb 23 03:04:47 localhost systemd-rc-local-generator[63235]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:47 localhost systemd-sysv-generator[63240]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:47 localhost systemd[1]: Started nova_virtproxyd container.
Feb 23 03:04:48 localhost python3[63262]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:48 localhost systemd[1]: Reloading.
Feb 23 03:04:48 localhost systemd-rc-local-generator[63287]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:48 localhost systemd-sysv-generator[63293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:48 localhost systemd[1]: Starting nova_virtqemud container...
Feb 23 03:04:48 localhost tripleo-start-podman-container[63302]: Creating additional drop-in dependency for "nova_virtqemud" (ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468)
Feb 23 03:04:48 localhost systemd[1]: Reloading.
Feb 23 03:04:48 localhost systemd-rc-local-generator[63358]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:48 localhost systemd-sysv-generator[63363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:48 localhost systemd[1]: Started nova_virtqemud container.
Feb 23 03:04:49 localhost python3[63385]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:49 localhost systemd[1]: Reloading.
Feb 23 03:04:49 localhost systemd-sysv-generator[63417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:49 localhost systemd-rc-local-generator[63413]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:50 localhost systemd[1]: Starting nova_virtsecretd container...
Feb 23 03:04:50 localhost tripleo-start-podman-container[63425]: Creating additional drop-in dependency for "nova_virtsecretd" (c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f)
Feb 23 03:04:50 localhost systemd[1]: Reloading.
Feb 23 03:04:50 localhost systemd-sysv-generator[63488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:50 localhost systemd-rc-local-generator[63482]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:50 localhost systemd[1]: Started nova_virtsecretd container.
Feb 23 03:04:51 localhost python3[63510]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:51 localhost systemd[1]: Reloading.
Feb 23 03:04:51 localhost systemd-rc-local-generator[63535]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:51 localhost systemd-sysv-generator[63538]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:51 localhost systemd[1]: Starting nova_virtstoraged container...
Feb 23 03:04:51 localhost tripleo-start-podman-container[63550]: Creating additional drop-in dependency for "nova_virtstoraged" (5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843)
Feb 23 03:04:51 localhost systemd[1]: Reloading.
Feb 23 03:04:51 localhost systemd-sysv-generator[63612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:51 localhost systemd-rc-local-generator[63607]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:51 localhost systemd[1]: Started nova_virtstoraged container.
Feb 23 03:04:52 localhost python3[63634]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:04:52 localhost systemd[1]: Reloading.
Feb 23 03:04:52 localhost systemd-rc-local-generator[63661]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:04:52 localhost systemd-sysv-generator[63666]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:04:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:04:52 localhost systemd[1]: Starting rsyslog container...
Feb 23 03:04:53 localhost systemd[1]: tmp-crun.d9cMHi.mount: Deactivated successfully.
Feb 23 03:04:53 localhost systemd[1]: Started libcrun container. Feb 23 03:04:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:53 localhost podman[63674]: 2026-02-23 08:04:53.082321327 +0000 UTC m=+0.125412534 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, container_name=rsyslog, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-rsyslog, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}) Feb 23 03:04:53 localhost podman[63674]: 2026-02-23 08:04:53.092587696 +0000 UTC m=+0.135678903 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, build-date=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510) Feb 23 03:04:53 localhost podman[63674]: rsyslog Feb 23 03:04:53 localhost systemd[1]: Started rsyslog container. 
Feb 23 03:04:53 localhost systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. Feb 23 03:04:53 localhost podman[63708]: 2026-02-23 08:04:53.260015056 +0000 UTC m=+0.049554321 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:04:53 localhost podman[63708]: 2026-02-23 08:04:53.286398571 +0000 UTC m=+0.075937806 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.13, build-date=2026-01-12T22:10:09Z, vcs-type=git, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=) Feb 23 03:04:53 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:04:53 localhost podman[63723]: 2026-02-23 08:04:53.373600938 +0000 UTC m=+0.062450624 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T22:10:09Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, vendor=Red Hat, 
Inc., container_name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Feb 23 03:04:53 localhost podman[63723]: rsyslog Feb 23 03:04:53 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 23 03:04:53 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Feb 23 03:04:53 localhost systemd[1]: Stopped rsyslog container. Feb 23 03:04:53 localhost systemd[1]: Starting rsyslog container... Feb 23 03:04:53 localhost python3[63750]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:04:53 localhost systemd[1]: Started libcrun container. 
Feb 23 03:04:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:53 localhost podman[63751]: 2026-02-23 08:04:53.668308438 +0000 UTC m=+0.094222491 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:04:53 localhost podman[63751]: 2026-02-23 08:04:53.678274249 +0000 UTC m=+0.104188322 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, version=17.1.13, release=1766032510, 
description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog) Feb 23 03:04:53 localhost podman[63751]: rsyslog Feb 23 03:04:53 localhost systemd[1]: Started rsyslog container. 
Feb 23 03:04:53 localhost systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. Feb 23 03:04:53 localhost podman[63772]: 2026-02-23 08:04:53.846503873 +0000 UTC m=+0.054848610 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, 
io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, com.redhat.component=openstack-rsyslog-container) Feb 23 03:04:53 localhost podman[63772]: 2026-02-23 08:04:53.872967182 +0000 UTC m=+0.081311879 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, distribution-scope=public, release=1766032510, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:09Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog) Feb 23 03:04:53 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:04:53 localhost podman[63784]: 2026-02-23 08:04:53.959256859 +0000 UTC m=+0.059393895 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, version=17.1.13, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Feb 23 03:04:53 localhost podman[63784]: rsyslog Feb 23 03:04:53 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 23 03:04:54 localhost systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully. Feb 23 03:04:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4-userdata-shm.mount: Deactivated successfully. Feb 23 03:04:54 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Feb 23 03:04:54 localhost systemd[1]: Stopped rsyslog container. Feb 23 03:04:54 localhost systemd[1]: Starting rsyslog container... Feb 23 03:04:54 localhost systemd[1]: Started libcrun container. 
Feb 23 03:04:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:54 localhost podman[63843]: 2026-02-23 08:04:54.305442031 +0000 UTC m=+0.124184493 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, container_name=rsyslog) Feb 23 03:04:54 localhost podman[63843]: 2026-02-23 08:04:54.314803031 +0000 UTC m=+0.133545503 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=rsyslog, release=1766032510, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, distribution-scope=public) Feb 23 03:04:54 localhost podman[63843]: rsyslog Feb 23 03:04:54 localhost systemd[1]: Started rsyslog container. 
Feb 23 03:04:54 localhost systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. Feb 23 03:04:54 localhost podman[63877]: 2026-02-23 08:04:54.466165255 +0000 UTC m=+0.050166010 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=rsyslog, build-date=2026-01-12T22:10:09Z) Feb 23 03:04:54 localhost podman[63877]: 2026-02-23 08:04:54.486660953 +0000 UTC m=+0.070661708 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, architecture=x86_64, tcib_managed=true, release=1766032510, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-rsyslog) Feb 23 03:04:54 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:04:54 localhost podman[63905]: 2026-02-23 08:04:54.572452734 +0000 UTC m=+0.059589862 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com) Feb 23 03:04:54 localhost podman[63905]: rsyslog Feb 23 03:04:54 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 23 03:04:54 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Feb 23 03:04:54 localhost systemd[1]: Stopped rsyslog container. Feb 23 03:04:54 localhost systemd[1]: Starting rsyslog container... Feb 23 03:04:54 localhost systemd[1]: Started libcrun container. Feb 23 03:04:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:54 localhost podman[63937]: 2026-02-23 08:04:54.922750328 +0000 UTC m=+0.098445808 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.created=2026-01-12T22:10:09Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team) Feb 23 03:04:54 localhost podman[63937]: 2026-02-23 08:04:54.932036186 +0000 UTC m=+0.107731656 container start 
94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, container_name=rsyslog, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:04:54 localhost podman[63937]: rsyslog Feb 23 03:04:54 localhost systemd[1]: Started rsyslog container. Feb 23 03:04:55 localhost systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. Feb 23 03:04:55 localhost podman[63971]: 2026-02-23 08:04:55.095274891 +0000 UTC m=+0.057289189 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, release=1766032510, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, container_name=rsyslog, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible) Feb 23 03:04:55 localhost systemd[1]: tmp-crun.rWPFJj.mount: Deactivated successfully. Feb 23 03:04:55 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4-userdata-shm.mount: Deactivated successfully. 
Feb 23 03:04:55 localhost podman[63971]: 2026-02-23 08:04:55.123607089 +0000 UTC m=+0.085621337 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, architecture=x86_64) Feb 23 03:04:55 localhost systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully. Feb 23 03:04:55 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:04:55 localhost podman[64000]: 2026-02-23 08:04:55.205521316 +0000 UTC m=+0.050048666 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, 
name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-rsyslog-container, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true) Feb 23 03:04:55 localhost podman[64000]: rsyslog Feb 23 03:04:55 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. 
Feb 23 03:04:55 localhost python3[63998]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005626463 step=3 update_config_hash_only=False Feb 23 03:04:55 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. Feb 23 03:04:55 localhost systemd[1]: Stopped rsyslog container. Feb 23 03:04:55 localhost systemd[1]: Starting rsyslog container... Feb 23 03:04:55 localhost systemd[1]: Started libcrun container. Feb 23 03:04:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Feb 23 03:04:55 localhost podman[64013]: 2026-02-23 08:04:55.672254004 +0000 UTC m=+0.101827937 container init 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, config_id=tripleo_step3, io.buildah.version=1.41.5, tcib_managed=true) Feb 23 03:04:55 localhost podman[64013]: 2026-02-23 08:04:55.681679556 +0000 UTC m=+0.111253489 container start 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 
(image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:09Z, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc.) Feb 23 03:04:55 localhost podman[64013]: rsyslog Feb 23 03:04:55 localhost systemd[1]: Started rsyslog container. Feb 23 03:04:55 localhost systemd[1]: libpod-94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4.scope: Deactivated successfully. Feb 23 03:04:55 localhost podman[64037]: 2026-02-23 08:04:55.812202622 +0000 UTC m=+0.041942696 container died 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=rsyslog, 
vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog) Feb 23 03:04:55 localhost podman[64037]: 2026-02-23 08:04:55.837837874 +0000 UTC m=+0.067577898 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
release=1766032510) Feb 23 03:04:55 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:04:55 localhost podman[64051]: 2026-02-23 08:04:55.92565702 +0000 UTC m=+0.056067079 container cleanup 94a5c22430959b0767f37ebdcadc6a54425aaa14ae15dbfff3d47acdfec747f4 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '8e5028e38f7077561ef1e3e50ec174a3'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T22:10:09Z, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, batch=17.1_20260112.1) Feb 23 03:04:55 localhost podman[64051]: rsyslog Feb 23 03:04:55 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 23 03:04:56 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Feb 23 03:04:56 localhost systemd[1]: Stopped rsyslog container. Feb 23 03:04:56 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Feb 23 03:04:56 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Feb 23 03:04:56 localhost systemd[1]: Failed to start rsyslog container. 
Feb 23 03:04:56 localhost python3[64079]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:04:57 localhost python3[64095]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 23 03:04:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 03:04:57 localhost podman[64096]: 2026-02-23 08:04:57.911385461 +0000 UTC m=+0.085181932 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, container_name=collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 23 03:04:57 localhost podman[64096]: 2026-02-23 08:04:57.951235769 +0000 UTC m=+0.125032220 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13)
Feb 23 03:04:57 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 03:04:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 03:04:59 localhost podman[64116]: 2026-02-23 08:04:59.89080482 +0000 UTC m=+0.070150881 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64)
Feb 23 03:04:59 localhost podman[64116]: 2026-02-23 08:04:59.906175882 +0000 UTC m=+0.085521903 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Feb 23 03:04:59 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 03:05:07 localhost sshd[64211]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:05:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 03:05:07 localhost podman[64212]: 2026-02-23 08:05:07.92344877 +0000 UTC m=+0.092097775 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z)
Feb 23 03:05:08 localhost podman[64212]: 2026-02-23 08:05:08.129450267 +0000 UTC m=+0.298099292 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=metrics_qdr)
Feb 23 03:05:08 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 03:05:17 localhost sshd[64242]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:05:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 03:05:28 localhost podman[64244]: 2026-02-23 08:05:28.919113594 +0000 UTC m=+0.091210236 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 03:05:28 localhost podman[64244]: 2026-02-23 08:05:28.954721395 +0000 UTC m=+0.126818037 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 03:05:28 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 03:05:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 03:05:30 localhost podman[64265]: 2026-02-23 08:05:30.903806731 +0000 UTC m=+0.081796024 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 03:05:30 localhost podman[64265]: 2026-02-23 08:05:30.91217636 +0000 UTC m=+0.090165623 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, vcs-type=git, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., tcib_managed=true)
Feb 23 03:05:30 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 03:05:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 03:05:38 localhost podman[64286]: 2026-02-23 08:05:38.900339024 +0000 UTC m=+0.069875562 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, version=17.1.13)
Feb 23 03:05:39 localhost podman[64286]: 2026-02-23 08:05:39.102236448 +0000 UTC m=+0.271772976 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 23 03:05:39 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 03:05:44 localhost sshd[64315]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:05:56 localhost sshd[64317]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 03:05:59 localhost podman[64319]: 2026-02-23 08:05:59.90731125 +0000 UTC m=+0.083071574 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 23 03:05:59 localhost podman[64319]: 2026-02-23 08:05:59.921304258 +0000 UTC m=+0.097064602 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 23 03:05:59 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 03:06:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:06:01 localhost systemd[1]: tmp-crun.nd9wS2.mount: Deactivated successfully. Feb 23 03:06:01 localhost podman[64340]: 2026-02-23 08:06:01.907733652 +0000 UTC m=+0.083600302 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git) Feb 23 03:06:01 localhost podman[64340]: 2026-02-23 08:06:01.91918183 +0000 UTC m=+0.095048500 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 23 03:06:01 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:06:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:06:09 localhost podman[64435]: 2026-02-23 08:06:09.907283421 +0000 UTC m=+0.080644247 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:06:10 localhost podman[64435]: 2026-02-23 08:06:10.109416943 +0000 UTC m=+0.282777739 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510) Feb 23 03:06:10 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:06:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:06:30 localhost podman[64465]: 2026-02-23 08:06:30.913790052 +0000 UTC m=+0.087031962 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, container_name=collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5) Feb 23 03:06:30 localhost podman[64465]: 2026-02-23 08:06:30.923605056 +0000 UTC m=+0.096847016 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git) Feb 23 03:06:30 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:06:32 localhost systemd[1]: tmp-crun.sE8eI4.mount: Deactivated successfully. Feb 23 03:06:32 localhost podman[64486]: 2026-02-23 08:06:32.913341945 +0000 UTC m=+0.088608032 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:06:32 localhost podman[64486]: 2026-02-23 08:06:32.952363517 +0000 UTC m=+0.127629604 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, release=1766032510, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:06:32 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:06:37 localhost sshd[64506]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:06:39 localhost sshd[64508]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:06:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:06:40 localhost systemd[1]: tmp-crun.QuEhI8.mount: Deactivated successfully. Feb 23 03:06:40 localhost podman[64510]: 2026-02-23 08:06:40.910813735 +0000 UTC m=+0.085533121 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, version=17.1.13, container_name=metrics_qdr, 
konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container) Feb 23 03:06:41 localhost podman[64510]: 2026-02-23 08:06:41.119375215 +0000 UTC m=+0.294094591 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 23 03:06:41 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:07:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:07:01 localhost systemd[1]: tmp-crun.ZE94jW.mount: Deactivated successfully. 
Feb 23 03:07:01 localhost podman[64539]: 2026-02-23 08:07:01.921593648 +0000 UTC m=+0.094842492 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd) Feb 23 03:07:01 localhost podman[64539]: 2026-02-23 08:07:01.933020756 +0000 UTC m=+0.106269630 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git) Feb 23 03:07:01 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:07:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:07:03 localhost podman[64558]: 2026-02-23 08:07:03.910694376 +0000 UTC m=+0.080996175 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, distribution-scope=public) Feb 23 03:07:03 localhost podman[64558]: 2026-02-23 08:07:03.945482609 +0000 UTC m=+0.115784478 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, architecture=x86_64, com.redhat.component=openstack-iscsid-container, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:07:03 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:07:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:07:11 localhost podman[64639]: 2026-02-23 08:07:11.913983285 +0000 UTC m=+0.083954230 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, 
com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 23 03:07:12 localhost podman[64639]: 2026-02-23 08:07:12.131583927 +0000 UTC m=+0.301554892 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:07:12 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:07:17 localhost sshd[64683]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:07:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:07:32 localhost podman[64685]: 2026-02-23 08:07:32.913934248 +0000 UTC m=+0.088100263 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) Feb 23 03:07:32 localhost podman[64685]: 2026-02-23 08:07:32.952282376 +0000 UTC m=+0.126448411 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, release=1766032510, container_name=collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com) Feb 23 03:07:32 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:07:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:07:34 localhost podman[64706]: 2026-02-23 08:07:34.905547269 +0000 UTC m=+0.082106770 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 23 03:07:34 localhost podman[64706]: 2026-02-23 08:07:34.944277999 +0000 UTC m=+0.120837490 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, container_name=iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3) Feb 23 03:07:34 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:07:35 localhost sshd[64726]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:07:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:07:42 localhost podman[64728]: 2026-02-23 08:07:42.917150438 +0000 UTC m=+0.095209683 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:07:43 localhost podman[64728]: 2026-02-23 08:07:43.14841286 +0000 UTC m=+0.326472115 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container) Feb 23 03:07:43 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:07:59 localhost sshd[64758]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:08:03 localhost podman[64760]: 2026-02-23 08:08:03.913445782 +0000 UTC m=+0.082886425 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, vcs-type=git, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com) Feb 23 03:08:03 localhost podman[64760]: 2026-02-23 08:08:03.921782811 +0000 UTC m=+0.091223464 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=collectd, tcib_managed=true, url=https://www.redhat.com) Feb 23 03:08:03 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:08:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:08:05 localhost podman[64780]: 2026-02-23 08:08:05.907349796 +0000 UTC m=+0.083720373 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc.) Feb 23 03:08:05 localhost podman[64780]: 2026-02-23 08:08:05.920259482 +0000 UTC m=+0.096630019 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:08:05 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:08:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:08:13 localhost podman[64878]: 2026-02-23 08:08:13.897016646 +0000 UTC m=+0.094600644 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 03:08:14 localhost podman[64878]: 2026-02-23 08:08:14.094145296 +0000 UTC m=+0.291729204 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, distribution-scope=public, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 23 03:08:14 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:08:30 localhost sshd[64957]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:08:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:08:34 localhost systemd[1]: tmp-crun.uwi1pF.mount: Deactivated successfully. 
Feb 23 03:08:34 localhost podman[64959]: 2026-02-23 08:08:34.910585636 +0000 UTC m=+0.080133767 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:10:15Z, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container) Feb 23 03:08:34 localhost podman[64959]: 2026-02-23 08:08:34.922722887 +0000 UTC m=+0.092271038 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:10:15Z, container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true) Feb 23 03:08:34 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:08:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:08:36 localhost podman[64980]: 2026-02-23 08:08:36.916762856 +0000 UTC m=+0.093009262 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:08:36 localhost podman[64980]: 2026-02-23 08:08:36.952366595 +0000 UTC m=+0.128613031 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:08:36 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:08:40 localhost sshd[64999]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:08:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:08:44 localhost systemd[1]: tmp-crun.3ykbQh.mount: Deactivated successfully. 
Feb 23 03:08:44 localhost podman[65001]: 2026-02-23 08:08:44.912745099 +0000 UTC m=+0.085795067 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, release=1766032510, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr) Feb 23 03:08:45 localhost podman[65001]: 2026-02-23 08:08:45.105911147 +0000 UTC m=+0.278961075 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step1, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Feb 23 03:08:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:08:50 localhost python3[65078]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:08:50 localhost python3[65123]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834130.036081-107933-2862644198044/source _original_basename=tmp3dlyj6vc follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:08:52 localhost python3[65185]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:08:52 localhost python3[65228]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834131.7460716-108020-175022870004861/source _original_basename=tmpgzuvoap0 follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:08:53 localhost python3[65290]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:08:53 localhost python3[65333]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834132.7300253-108074-181335551316782/source _original_basename=tmpiy7e7gqm follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:08:54 localhost python3[65395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:08:54 localhost python3[65438]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834133.7447448-108147-246640380453297/source _original_basename=tmpm63ch93t follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:08:55 localhost python3[65468]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 03:08:55 localhost systemd[1]: Reloading.
Feb 23 03:08:55 localhost systemd-rc-local-generator[65489]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:08:55 localhost systemd-sysv-generator[65493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:08:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:08:55 localhost systemd[1]: Reloading.
Feb 23 03:08:55 localhost systemd-sysv-generator[65535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:08:55 localhost systemd-rc-local-generator[65529]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:08:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:08:56 localhost python3[65558]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:08:56 localhost systemd[1]: Reloading.
Feb 23 03:08:56 localhost systemd-rc-local-generator[65580]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:08:56 localhost systemd-sysv-generator[65586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:08:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:08:56 localhost systemd[1]: Reloading.
Feb 23 03:08:56 localhost systemd-rc-local-generator[65621]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:08:56 localhost systemd-sysv-generator[65627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:08:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:08:56 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Feb 23 03:08:57 localhost python3[65649]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 03:08:57 localhost systemd[1]: Reloading.
Feb 23 03:08:57 localhost systemd-sysv-generator[65677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:08:57 localhost systemd-rc-local-generator[65674]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:08:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:08:58 localhost python3[65734]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:08:58 localhost python3[65777]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834137.842005-108308-8662771612812/source _original_basename=tmpdo6uhayd follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:08:59 localhost python3[65807]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:08:59 localhost systemd[1]: Reloading.
Feb 23 03:08:59 localhost systemd-sysv-generator[65837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:08:59 localhost systemd-rc-local-generator[65834]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:08:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:08:59 localhost systemd[1]: Reached target tripleo_nova_libvirt.target.
Feb 23 03:08:59 localhost python3[65862]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 03:09:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 03:09:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4530 writes, 20K keys, 4530 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4530 writes, 464 syncs, 9.76 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 215 writes, 495 keys, 215 commit groups, 1.0 writes per commit group, ingest: 0.42 MB, 0.00 MB/s#012Interval WAL: 215 writes, 106 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 03:09:01 localhost ansible-async_wrapper.py[66034]: Invoked with 133668917751 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834140.9264715-108392-75319085338427/AnsiballZ_command.py _
Feb 23 03:09:01 localhost ansible-async_wrapper.py[66037]: Starting module and watcher
Feb 23 03:09:01 localhost ansible-async_wrapper.py[66037]: Start watching 66038 (3600)
Feb 23 03:09:01 localhost ansible-async_wrapper.py[66038]: Start module (66038)
Feb 23 03:09:01 localhost ansible-async_wrapper.py[66034]: Return async_wrapper task started.
Feb 23 03:09:01 localhost python3[66058]: ansible-ansible.legacy.async_status Invoked with jid=133668917751.66034 mode=status _async_dir=/tmp/.ansible_async
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 03:09:05 localhost puppet-user[66045]: (file: /etc/puppet/hiera.yaml)
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: Undefined variable '::deploy_config_name';
Feb 23 03:09:05 localhost puppet-user[66045]: (file & line not available)
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 03:09:05 localhost puppet-user[66045]: (file & line not available)
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:09:05 localhost puppet-user[66045]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:09:05 localhost puppet-user[66045]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:09:05 localhost puppet-user[66045]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:09:05 localhost puppet-user[66045]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:09:05 localhost puppet-user[66045]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:09:05 localhost puppet-user[66045]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:09:05 localhost puppet-user[66045]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:09:05 localhost puppet-user[66045]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:09:05 localhost puppet-user[66045]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:09:05 localhost puppet-user[66045]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:09:05 localhost puppet-user[66045]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:09:05 localhost puppet-user[66045]: Warning: Unknown variable: '::deployment_type'.
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Feb 23 03:09:05 localhost puppet-user[66045]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.23 seconds Feb 23 03:09:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:09:05 localhost podman[66174]: 2026-02-23 08:09:05.929717017 +0000 UTC m=+0.096912086 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1766032510, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true) Feb 23 03:09:05 localhost podman[66174]: 2026-02-23 08:09:05.943481846 +0000 UTC m=+0.110676955 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:15Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd)
Feb 23 03:09:05 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 03:09:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 03:09:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 5013 writes, 22K keys, 5013 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5013 writes, 561 syncs, 8.94 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 146 writes, 382 keys, 146 commit groups, 1.0 writes per commit group, ingest: 0.34 MB, 0.00 MB/s#012Interval WAL: 146 writes, 72 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 03:09:06 localhost ansible-async_wrapper.py[66037]: 66038 still running (3600)
Feb 23 03:09:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 03:09:07 localhost systemd[1]: tmp-crun.kjmOyP.mount: Deactivated successfully.
Feb 23 03:09:07 localhost podman[66197]: 2026-02-23 08:09:07.894991537 +0000 UTC m=+0.076699024 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:09:07 localhost podman[66197]: 2026-02-23 08:09:07.903818592 +0000 UTC m=+0.085526109 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:09:07 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:09:11 localhost ansible-async_wrapper.py[66037]: 66038 still running (3595) Feb 23 03:09:12 localhost python3[66286]: ansible-ansible.legacy.async_status Invoked with jid=133668917751.66034 mode=status _async_dir=/tmp/.ansible_async Feb 23 03:09:14 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 03:09:14 localhost systemd[1]: Starting man-db-cache-update.service... Feb 23 03:09:14 localhost systemd[1]: Reloading. 
Feb 23 03:09:14 localhost systemd-sysv-generator[66393]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:09:14 localhost systemd-rc-local-generator[66388]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:09:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:09:14 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 03:09:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 03:09:15 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 03:09:15 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 03:09:15 localhost systemd[1]: man-db-cache-update.service: Consumed 1.282s CPU time.
Feb 23 03:09:15 localhost systemd[1]: run-r66cda147f9a64a7fb4a68f0618c030d7.service: Deactivated successfully.
Feb 23 03:09:15 localhost systemd[1]: tmp-crun.tbCC8h.mount: Deactivated successfully.
Feb 23 03:09:15 localhost podman[67389]: 2026-02-23 08:09:15.578128328 +0000 UTC m=+0.096653637 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5) Feb 23 03:09:15 localhost podman[67389]: 2026-02-23 08:09:15.797503793 +0000 UTC m=+0.316029162 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, 
maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 23 03:09:15 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:09:16 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created Feb 23 03:09:16 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}aa026ec7dfd630c335e361683a845a48dbc161e201acd6e8ba3a46c1ecc947e6' Feb 23 03:09:16 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd' Feb 23 03:09:16 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea' Feb 23 03:09:16 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97' Feb 23 03:09:16 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events Feb 23 03:09:16 localhost ansible-async_wrapper.py[66037]: 66038 still running (3590) Feb 23 03:09:21 localhost podman[67612]: Feb 23 03:09:21 localhost puppet-user[66045]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully Feb 23 03:09:21 localhost podman[67612]: 2026-02-23 08:09:21.275378476 +0000 UTC m=+0.081148303 container create 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z, 
CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public) Feb 23 03:09:21 localhost sshd[67627]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:09:21 localhost systemd[1]: Started libpod-conmon-16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5.scope. Feb 23 03:09:21 localhost podman[67612]: 2026-02-23 08:09:21.242655225 +0000 UTC m=+0.048425082 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 03:09:21 localhost systemd[1]: Started libcrun container. 
Feb 23 03:09:21 localhost podman[67612]: 2026-02-23 08:09:21.360194972 +0000 UTC m=+0.165964799 container init 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, architecture=x86_64, distribution-scope=public, RELEASE=main) Feb 23 03:09:21 localhost podman[67612]: 2026-02-23 08:09:21.372227257 +0000 UTC m=+0.177997104 container start 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, GIT_CLEAN=True, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 03:09:21 localhost podman[67612]: 2026-02-23 08:09:21.372524747 +0000 UTC m=+0.178294574 container attach 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, 
io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64) Feb 23 03:09:21 localhost stupefied_turing[67632]: 167 167 Feb 23 03:09:21 localhost systemd[1]: libpod-16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5.scope: Deactivated successfully. Feb 23 03:09:21 localhost podman[67612]: 2026-02-23 08:09:21.37552639 +0000 UTC m=+0.181296247 container died 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux , release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, name=rhceph, vendor=Red Hat, Inc.) Feb 23 03:09:21 localhost systemd[1]: Reloading. Feb 23 03:09:21 localhost systemd-sysv-generator[67680]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:21 localhost systemd-rc-local-generator[67673]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:21 localhost ansible-async_wrapper.py[66037]: 66038 still running (3585) Feb 23 03:09:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:21 localhost podman[67639]: 2026-02-23 08:09:21.747362883 +0000 UTC m=+0.354144431 container remove 16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_turing, maintainer=Guillaume Abrioux , architecture=x86_64, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.openshift.expose-services=, name=rhceph, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main) Feb 23 03:09:21 localhost systemd[1]: Starting 
Simple Network Management Protocol (SNMP) Daemon.... Feb 23 03:09:21 localhost systemd[1]: libpod-conmon-16803b33e6478136b4d9c01d5ba933ddc5dc797998af1cff9b5a3e5d32d9b6e5.scope: Deactivated successfully. Feb 23 03:09:21 localhost snmpd[67690]: Can't find directory of RPM packages Feb 23 03:09:21 localhost snmpd[67690]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB Feb 23 03:09:21 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon.. Feb 23 03:09:21 localhost systemd[1]: Reloading. Feb 23 03:09:21 localhost podman[67700]: Feb 23 03:09:21 localhost podman[67700]: 2026-02-23 08:09:21.989955172 +0000 UTC m=+0.085918522 container create 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git) Feb 23 03:09:21 
localhost systemd-rc-local-generator[67735]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:21 localhost systemd-sysv-generator[67740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:22 localhost podman[67700]: 2026-02-23 08:09:21.953552466 +0000 UTC m=+0.049515846 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 03:09:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:22 localhost systemd[1]: var-lib-containers-storage-overlay-fade5e80820262dc4cd0d76f1c0aef2432ea8e922812ec0df05298c03d71e744-merged.mount: Deactivated successfully. Feb 23 03:09:22 localhost systemd[1]: Started libpod-conmon-7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a.scope. Feb 23 03:09:22 localhost systemd[1]: Reloading. Feb 23 03:09:22 localhost systemd-rc-local-generator[67776]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:22 localhost systemd-sysv-generator[67780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:22 localhost puppet-user[66045]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running' Feb 23 03:09:22 localhost systemd[1]: Started libcrun container. 
Feb 23 03:09:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 03:09:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 03:09:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 03:09:22 localhost puppet-user[66045]: Notice: Applied catalog in 16.95 seconds
Feb 23 03:09:22 localhost puppet-user[66045]: Application:
Feb 23 03:09:22 localhost puppet-user[66045]: Initial environment: production
Feb 23 03:09:22 localhost puppet-user[66045]: Converged environment: production
Feb 23 03:09:22 localhost puppet-user[66045]: Run mode: user
Feb 23 03:09:22 localhost puppet-user[66045]: Changes:
Feb 23 03:09:22 localhost puppet-user[66045]: Total: 8
Feb 23 03:09:22 localhost puppet-user[66045]: Events:
Feb 23 03:09:22 localhost puppet-user[66045]: Success: 8
Feb 23 03:09:22 localhost puppet-user[66045]: Total: 8
Feb 23 03:09:22 localhost puppet-user[66045]: Resources:
Feb 23 03:09:22 localhost puppet-user[66045]: Restarted: 1
Feb 23 03:09:22 localhost puppet-user[66045]: Changed: 8
Feb 23 03:09:22 localhost puppet-user[66045]: Out of sync: 8
Feb 23 03:09:22 localhost puppet-user[66045]: Total: 19
Feb 23 03:09:22 localhost puppet-user[66045]: Time:
Feb 23 03:09:22 localhost puppet-user[66045]: Filebucket: 0.00
Feb 23 03:09:22 localhost puppet-user[66045]: Schedule: 0.00
Feb 23 03:09:22 localhost puppet-user[66045]: Augeas: 0.01
Feb 23 03:09:22 localhost puppet-user[66045]: File: 0.10
Feb 23 03:09:22 localhost puppet-user[66045]: Config retrieval: 0.30
Feb 23 03:09:22 localhost puppet-user[66045]: Service: 1.30
Feb 23 03:09:22 localhost puppet-user[66045]: Package: 10.21
Feb 23 03:09:22 localhost puppet-user[66045]: Transaction evaluation: 16.94
Feb 23 03:09:22 localhost puppet-user[66045]: Catalog application: 16.95
Feb 23 03:09:22 localhost puppet-user[66045]: Last run: 1771834162
Feb 23 03:09:22 localhost puppet-user[66045]: Exec: 5.10
Feb 23 03:09:22 localhost puppet-user[66045]: Total: 16.95
Feb 23 03:09:22 localhost puppet-user[66045]: Version:
Feb 23 03:09:22 localhost puppet-user[66045]: Config: 1771834145
Feb 23 03:09:22 localhost puppet-user[66045]: Puppet: 7.10.0
Feb 23 03:09:22 localhost podman[67700]: 2026-02-23 08:09:22.541994817 +0000 UTC m=+0.637958187 container init 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1770267347, architecture=x86_64, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 03:09:22 localhost systemd[1]: tmp-crun.BES9Lf.mount: Deactivated successfully. Feb 23 03:09:22 localhost podman[67700]: 2026-02-23 08:09:22.559933006 +0000 UTC m=+0.655896366 container start 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, name=rhceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Feb 23 03:09:22 localhost podman[67700]: 2026-02-23 08:09:22.560207245 +0000 UTC m=+0.656170605 container attach 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True) Feb 23 03:09:22 localhost ansible-async_wrapper.py[66038]: Module complete (66038) Feb 23 03:09:22 localhost python3[67804]: ansible-ansible.legacy.async_status Invoked with jid=133668917751.66034 mode=status _async_dir=/tmp/.ansible_async Feb 23 03:09:23 localhost python3[68340]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: [ Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: { Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "available": false, Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "ceph_device": false, Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 23 03:09:23 
localhost xenodochial_rhodes[67752]: "lsm_data": {},
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "lvs": [],
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "path": "/dev/sr0",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "rejected_reasons": [
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "Has a FileSystem",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "Insufficient space (<5GB)"
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: ],
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "sys_api": {
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "actuators": null,
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "device_nodes": "sr0",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "human_readable_size": "482.00 KB",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "id_bus": "ata",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "model": "QEMU DVD-ROM",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "nr_requests": "2",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "partitions": {},
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "path": "/dev/sr0",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "removable": "1",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "rev": "2.5+",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "ro": "0",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "rotational": "1",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "sas_address": "",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "sas_device_handle": "",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "scheduler_mode": "mq-deadline",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "sectors": 0,
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "sectorsize": "2048",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "size": 493568.0,
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "support_discard": "0",
Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: 
"type": "disk", Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: "vendor": "QEMU" Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: } Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: } Feb 23 03:09:23 localhost xenodochial_rhodes[67752]: ] Feb 23 03:09:23 localhost systemd[1]: libpod-7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a.scope: Deactivated successfully. Feb 23 03:09:23 localhost podman[67700]: 2026-02-23 08:09:23.516760662 +0000 UTC m=+1.612724002 container died 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, vcs-type=git, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-09T10:25:24Z) Feb 23 03:09:23 localhost systemd[1]: var-lib-containers-storage-overlay-b623fcab808d66d09a5f747fe60e1e62189817e8e088d3cf19e4817793f01314-merged.mount: Deactivated successfully. 
Feb 23 03:09:23 localhost podman[69387]: 2026-02-23 08:09:23.60741468 +0000 UTC m=+0.082665520 container remove 7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_rhodes, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 03:09:23 localhost systemd[1]: libpod-conmon-7f6a1be6e3f428dd2bd6810198d4774605a62d37052bd0779a7ae2bd22983b0a.scope: Deactivated successfully.
Feb 23 03:09:23 localhost python3[69415]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 03:09:24 localhost python3[69480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:24 localhost python3[69498]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpxjnmgbdj recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 03:09:25 localhost python3[69528]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:26 localhost python3[69631]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 03:09:26 localhost ansible-async_wrapper.py[66037]: Done in kid B.
Feb 23 03:09:26 localhost python3[69650]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:27 localhost python3[69682]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 03:09:28 localhost sshd[69722]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:09:28 localhost python3[69733]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:28 localhost python3[69752]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:29 localhost python3[69814]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:29 localhost python3[69832]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:30 localhost python3[69895]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:30 localhost python3[69913]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:31 localhost python3[69975]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:31 localhost python3[69993]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:31 localhost python3[70023]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:09:31 localhost systemd[1]: Reloading.
Feb 23 03:09:32 localhost systemd-rc-local-generator[70049]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:09:32 localhost systemd-sysv-generator[70053]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:09:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:09:33 localhost python3[70109]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:34 localhost python3[70127]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:34 localhost python3[70189]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:09:35 localhost python3[70207]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:09:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 03:09:36 localhost systemd[1]: tmp-crun.7s4xhL.mount: Deactivated successfully.
Feb 23 03:09:36 localhost podman[70238]: 2026-02-23 08:09:36.398908224 +0000 UTC m=+0.100579789 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 03:09:36 localhost podman[70238]: 2026-02-23 08:09:36.409553786 +0000 UTC m=+0.111225291 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 23 03:09:36 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 03:09:36 localhost python3[70237]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:09:36 localhost systemd[1]: Reloading.
Feb 23 03:09:36 localhost systemd-sysv-generator[70283]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:09:36 localhost systemd-rc-local-generator[70278]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:09:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:09:36 localhost systemd[1]: Starting Create netns directory...
Feb 23 03:09:36 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 03:09:36 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 03:09:36 localhost systemd[1]: Finished Create netns directory.
Feb 23 03:09:37 localhost python3[70314]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 03:09:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 03:09:38 localhost podman[70357]: 2026-02-23 08:09:38.913104663 +0000 UTC m=+0.080967038 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 23 03:09:38 localhost podman[70357]: 2026-02-23 08:09:38.924749816 +0000 UTC m=+0.092612231 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Feb 23 03:09:38 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully.
Feb 23 03:09:39 localhost python3[70392]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 03:09:39 localhost podman[70562]: 2026-02-23 08:09:39.807662375 +0000 UTC m=+0.088820643 container create b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, architecture=x86_64, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, release=1766032510, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 03:09:39 localhost podman[70548]: 2026-02-23 08:09:39.830458406 +0000 UTC m=+0.130459941 container create daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, build-date=2026-01-12T23:31:49Z, container_name=nova_libvirt_init_secret, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 03:09:39 localhost podman[70559]: 2026-02-23 08:09:39.753293609 +0000 UTC m=+0.034885670 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 23 03:09:39 localhost podman[70604]: 2026-02-23 08:09:39.858122549 +0000 UTC m=+0.088603466 container create 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com)
Feb 23 03:09:39 localhost systemd[1]: Started libpod-conmon-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope.
Feb 23 03:09:39 localhost podman[70548]: 2026-02-23 08:09:39.76394818 +0000 UTC m=+0.063949735 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 23 03:09:39 localhost systemd[1]: Started libpod-conmon-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d.scope.
Feb 23 03:09:39 localhost podman[70562]: 2026-02-23 08:09:39.766075347 +0000 UTC m=+0.047233625 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 23 03:09:39 localhost podman[70559]: 2026-02-23 08:09:39.867684577 +0000 UTC m=+0.149276628 container create 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, container_name=configure_cms_options, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:36:40Z, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 23 03:09:39 localhost systemd[1]: Started libcrun container.
Feb 23 03:09:39 localhost systemd[1]: Started libcrun container.
Feb 23 03:09:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:39 localhost podman[70585]: 2026-02-23 08:09:39.789813338 +0000 UTC m=+0.045221733 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 23 03:09:39 localhost systemd[1]: Started libpod-conmon-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd.scope. 
Feb 23 03:09:39 localhost podman[70548]: 2026-02-23 08:09:39.892480831 +0000 UTC m=+0.192482356 container init daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=nova_libvirt_init_secret, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:09:39 localhost systemd[1]: Started libcrun container. Feb 23 03:09:39 localhost podman[70604]: 2026-02-23 08:09:39.802329108 +0000 UTC m=+0.032810035 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 23 03:09:39 localhost podman[70548]: 2026-02-23 08:09:39.903644699 +0000 UTC m=+0.203646234 container start daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=nova_libvirt_init_secret, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true) Feb 23 03:09:39 localhost podman[70548]: 2026-02-23 08:09:39.914971913 +0000 UTC m=+0.214973428 container attach daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, container_name=nova_libvirt_init_secret, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', 
'/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5) Feb 23 03:09:39 localhost podman[70585]: 2026-02-23 08:09:39.916103089 +0000 UTC m=+0.171511474 container create 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4) Feb 23 03:09:39 localhost systemd[1]: Started libpod-conmon-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope. Feb 23 03:09:39 localhost systemd[1]: Started libcrun container. Feb 23 03:09:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85b58d6db47c08b0dc415e7676af05b270f66bddea6d8ca4f2d3998d7b04080d/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 03:09:39 localhost podman[70562]: 2026-02-23 08:09:39.968555815 +0000 UTC m=+0.249714093 container init b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, container_name=logrotate_crond, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:09:39 localhost podman[70559]: 2026-02-23 08:09:39.970344771 +0000 UTC m=+0.251936812 container init 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc.) 
Feb 23 03:09:39 localhost podman[70559]: 2026-02-23 08:09:39.977936828 +0000 UTC m=+0.259528859 container start 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, container_name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Feb 23 03:09:39 localhost podman[70559]: 2026-02-23 08:09:39.978262988 +0000 UTC m=+0.259855059 container attach 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=configure_cms_options, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:09:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 03:09:40 localhost podman[70562]: 2026-02-23 08:09:40.004382813 +0000 UTC m=+0.285541081 container start b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:09:40 localhost podman[70585]: 2026-02-23 08:09:40.010952748 +0000 UTC m=+0.266361123 container init 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Feb 23 03:09:40 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label 
managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z 
registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Feb 23 03:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:09:40 localhost systemd[1]: libpod-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d.scope: Deactivated successfully. Feb 23 03:09:40 localhost podman[70585]: 2026-02-23 08:09:40.050009217 +0000 UTC m=+0.305417602 container start 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:09:40 localhost podman[70548]: 2026-02-23 08:09:40.051223754 +0000 UTC m=+0.351225279 container died daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_libvirt_init_secret, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 03:09:40 localhost python3[70392]: ansible-tripleo_container_manage 
PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=44281c742f88411d75916a4e58499720 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Feb 23 03:09:40 localhost systemd[1]: Started libpod-conmon-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope. Feb 23 03:09:40 localhost ovs-vsctl[70713]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Feb 23 03:09:40 localhost podman[70559]: 2026-02-23 08:09:40.127330997 +0000 UTC m=+0.408923028 container died 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ovn-controller-container) Feb 23 03:09:40 localhost systemd[1]: Started libcrun container. Feb 23 03:09:40 localhost systemd[1]: libpod-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd.scope: Deactivated successfully. 
Feb 23 03:09:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51915910ced93426f00f1704499e6c4900ce6f68bf275b1a1584b9abaa73dcbc/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:40 localhost podman[70665]: 2026-02-23 08:09:40.145929448 +0000 UTC m=+0.127113386 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 23 03:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:09:40 localhost podman[70604]: 2026-02-23 08:09:40.155064023 +0000 UTC m=+0.385544950 container init 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:09:40 localhost podman[70665]: 2026-02-23 08:09:40.15528846 +0000 UTC m=+0.136472408 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, 
com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13) Feb 23 03:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:09:40 localhost podman[70604]: 2026-02-23 08:09:40.183855152 +0000 UTC m=+0.414336069 container start 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:09:40 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=44281c742f88411d75916a4e58499720 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Feb 23 03:09:40 localhost podman[70697]: 2026-02-23 08:09:40.187248258 +0000 UTC m=+0.124850156 container cleanup daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_libvirt_init_secret, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:31:49Z, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible) Feb 23 03:09:40 localhost systemd[1]: libpod-conmon-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d.scope: Deactivated successfully. Feb 23 03:09:40 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', 
'/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Feb 23 03:09:40 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:09:40 localhost podman[70687]: 2026-02-23 08:09:40.277216276 +0000 UTC m=+0.228760929 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 23 03:09:40 localhost podman[70727]: 2026-02-23 08:09:40.299971206 +0000 UTC m=+0.157498356 container cleanup 59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=configure_cms_options, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4) Feb 23 03:09:40 localhost systemd[1]: libpod-conmon-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd.scope: Deactivated successfully. 
Feb 23 03:09:40 localhost podman[70760]: 2026-02-23 08:09:40.262117745 +0000 UTC m=+0.073825905 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Feb 23 03:09:40 localhost podman[70760]: 2026-02-23 08:09:40.346144906 +0000 UTC m=+0.157853076 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:09:40 localhost podman[70760]: unhealthy Feb 23 03:09:40 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:09:40 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed with result 'exit-code'. 
Feb 23 03:09:40 localhost podman[70687]: 2026-02-23 08:09:40.361123164 +0000 UTC m=+0.312667827 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.expose-services=) Feb 23 03:09:40 localhost podman[70687]: unhealthy Feb 23 03:09:40 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi Feb 23 03:09:40 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:09:40 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed with result 'exit-code'. Feb 23 03:09:40 localhost podman[70934]: 2026-02-23 08:09:40.693629479 +0000 UTC m=+0.085251091 container create 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=setup_ovs_manager) Feb 23 03:09:40 localhost systemd[1]: Started libpod-conmon-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope. Feb 23 03:09:40 localhost systemd[1]: Started libcrun container. 
Feb 23 03:09:40 localhost podman[70934]: 2026-02-23 08:09:40.648026076 +0000 UTC m=+0.039647748 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 23 03:09:40 localhost podman[70934]: 2026-02-23 08:09:40.768734562 +0000 UTC m=+0.160356194 container init 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1) Feb 23 03:09:40 localhost podman[70934]: 2026-02-23 08:09:40.780408576 +0000 UTC m=+0.172030208 container start 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=setup_ovs_manager, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': 
{'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, build-date=2026-01-12T22:56:19Z, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Feb 23 03:09:40 localhost podman[70934]: 2026-02-23 08:09:40.780658864 +0000 UTC m=+0.172280496 container attach 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=setup_ovs_manager, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com) Feb 23 03:09:40 localhost podman[70953]: 2026-02-23 08:09:40.810723793 +0000 UTC m=+0.151414506 container create 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 23 03:09:40 localhost systemd[1]: Started libpod-conmon-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope. Feb 23 03:09:40 localhost podman[70953]: 2026-02-23 08:09:40.753398384 +0000 UTC m=+0.094089117 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 03:09:40 localhost systemd[1]: Started libcrun container. 
Feb 23 03:09:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4752b195a0319d00ad3a4bd86f4312afcec268e914950a9934c95f1e8044f1fa/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:09:40 localhost podman[70953]: 2026-02-23 08:09:40.904046364 +0000 UTC m=+0.244737077 container init 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 23 03:09:40 localhost systemd[1]: var-lib-containers-storage-overlay-025f13926bcfeedaf7e085dde432a5009541a3e067b7031c72f0d516a81ad107-merged.mount: Deactivated successfully. Feb 23 03:09:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59486ce627fd459b6e97a4f486334ca248af41ccabbc9b8b32e2d2fa128e6dfd-userdata-shm.mount: Deactivated successfully. Feb 23 03:09:40 localhost systemd[1]: var-lib-containers-storage-overlay-e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde-merged.mount: Deactivated successfully. Feb 23 03:09:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-daefcff5bb86c96499bf95810fc86a5c166aff728ac3ee427f8a688f0b693d1d-userdata-shm.mount: Deactivated successfully. Feb 23 03:09:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:09:40 localhost podman[70953]: 2026-02-23 08:09:40.966739061 +0000 UTC m=+0.307429754 container start 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:09:40 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 03:09:41 localhost podman[70989]: 2026-02-23 08:09:41.113327864 +0000 UTC m=+0.137848402 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:09:41 localhost podman[70989]: 2026-02-23 08:09:41.475932519 +0000 UTC m=+0.500453117 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:09:41 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:09:41 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Feb 23 03:09:43 localhost ovs-vsctl[71159]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager Feb 23 03:09:43 localhost systemd[1]: libpod-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope: Deactivated successfully. Feb 23 03:09:43 localhost systemd[1]: libpod-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope: Consumed 2.846s CPU time. 
Feb 23 03:09:43 localhost podman[71160]: 2026-02-23 08:09:43.699922602 +0000 UTC m=+0.039688499 container died 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=setup_ovs_manager, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4) Feb 23 03:09:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83-userdata-shm.mount: Deactivated successfully. Feb 23 03:09:43 localhost systemd[1]: var-lib-containers-storage-overlay-6c9a1aacc166b15a11e7e7c477a5bec4f993243887d4de4680eae8258483d960-merged.mount: Deactivated successfully. 
Feb 23 03:09:43 localhost podman[71160]: 2026-02-23 08:09:43.751555763 +0000 UTC m=+0.091321610 container cleanup 46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, release=1766032510, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13) Feb 23 03:09:43 localhost systemd[1]: libpod-conmon-46cd96bfccaf34fdb6b427dbf0502b9947fc7c26e9225c58dea27a019553bc83.scope: Deactivated successfully. Feb 23 03:09:43 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771832380 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771832380'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Feb 23 03:09:44 localhost podman[71273]: 2026-02-23 08:09:44.214276561 +0000 UTC m=+0.067095415 container create 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:09:44 localhost podman[71274]: 2026-02-23 08:09:44.254597579 +0000 UTC m=+0.101569600 container create 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:09:44 localhost systemd[1]: Started libpod-conmon-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope. Feb 23 03:09:44 localhost systemd[1]: Started libcrun container. 
Feb 23 03:09:44 localhost podman[71273]: 2026-02-23 08:09:44.177172153 +0000 UTC m=+0.029991057 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 23 03:09:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:44 localhost systemd[1]: Started libpod-conmon-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope. Feb 23 03:09:44 localhost podman[71274]: 2026-02-23 08:09:44.208057797 +0000 UTC m=+0.055029848 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 23 03:09:44 localhost systemd[1]: Started libcrun container. 
Feb 23 03:09:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Feb 23 03:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:09:44 localhost podman[71273]: 2026-02-23 08:09:44.33410554 +0000 UTC m=+0.186924414 container init 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, vcs-type=git) Feb 23 03:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:09:44 localhost podman[71274]: 2026-02-23 08:09:44.356381425 +0000 UTC m=+0.203353446 container init 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent) Feb 23 03:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:09:44 localhost podman[71273]: 2026-02-23 08:09:44.382994585 +0000 UTC m=+0.235813459 container start 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Feb 23 03:09:44 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:09:44 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Feb 23 03:09:44 localhost systemd[1]: Created slice User Slice of UID 0. 
Feb 23 03:09:44 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 23 03:09:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:09:44 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 23 03:09:44 localhost podman[71274]: 2026-02-23 08:09:44.429992902 +0000 UTC m=+0.276964923 container start 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team) Feb 23 03:09:44 localhost systemd[1]: Starting User Manager for UID 0... 
Feb 23 03:09:44 localhost python3[70392]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=cf62475d9880911ecf982eff6ab572ad --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume 
/etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 23 03:09:44 localhost podman[71316]: 2026-02-23 08:09:44.487998842 +0000 UTC m=+0.098229966 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, vcs-type=git, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) 
Feb 23 03:09:44 localhost podman[71316]: 2026-02-23 08:09:44.571162596 +0000 UTC m=+0.181393680 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 23 03:09:44 localhost podman[71316]: unhealthy Feb 23 03:09:44 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:09:44 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:09:44 localhost systemd[71337]: Queued start job for default target Main User Target. Feb 23 03:09:44 localhost systemd[71337]: Created slice User Application Slice. Feb 23 03:09:44 localhost systemd[71337]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 23 03:09:44 localhost systemd[71337]: Started Daily Cleanup of User's Temporary Directories. Feb 23 03:09:44 localhost systemd[71337]: Reached target Paths. Feb 23 03:09:44 localhost systemd[71337]: Reached target Timers. Feb 23 03:09:44 localhost systemd[71337]: Starting D-Bus User Message Bus Socket... Feb 23 03:09:44 localhost systemd[71337]: Starting Create User's Volatile Files and Directories... 
Feb 23 03:09:44 localhost podman[71333]: 2026-02-23 08:09:44.620776615 +0000 UTC m=+0.180362969 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com) Feb 23 03:09:44 localhost systemd[71337]: Listening on D-Bus User Message Bus Socket. Feb 23 03:09:44 localhost systemd[71337]: Reached target Sockets. Feb 23 03:09:44 localhost systemd[71337]: Finished Create User's Volatile Files and Directories. Feb 23 03:09:44 localhost systemd[71337]: Reached target Basic System. Feb 23 03:09:44 localhost systemd[71337]: Reached target Main User Target. Feb 23 03:09:44 localhost systemd[71337]: Startup finished in 152ms. Feb 23 03:09:44 localhost systemd[1]: Started User Manager for UID 0. Feb 23 03:09:44 localhost systemd[1]: Started Session c9 of User root. 
Feb 23 03:09:44 localhost podman[71333]: 2026-02-23 08:09:44.663138396 +0000 UTC m=+0.222724760 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_id=tripleo_step4) Feb 23 03:09:44 localhost podman[71333]: unhealthy Feb 23 03:09:44 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:09:44 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:09:44 localhost systemd[1]: session-c9.scope: Deactivated successfully. Feb 23 03:09:44 localhost kernel: device br-int entered promiscuous mode Feb 23 03:09:44 localhost NetworkManager[5974]: [1771834184.8065] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Feb 23 03:09:44 localhost systemd-udevd[71428]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 03:09:45 localhost python3[71448]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:45 localhost python3[71464]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:45 localhost python3[71480]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:45 localhost kernel: device genev_sys_6081 entered promiscuous mode Feb 23 03:09:45 localhost NetworkManager[5974]: [1771834185.8497] device (genev_sys_6081): carrier: link connected Feb 23 03:09:45 localhost systemd-udevd[71430]: Network interface NamePolicy= disabled on kernel command line. Feb 23 03:09:45 localhost NetworkManager[5974]: [1771834185.8502] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Feb 23 03:09:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:09:45 localhost podman[71499]: 2026-02-23 08:09:45.990132922 +0000 UTC m=+0.093130097 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr) Feb 23 03:09:46 localhost python3[71500]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:46 localhost podman[71499]: 2026-02-23 08:09:46.173693019 +0000 UTC m=+0.276690154 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-qdrouterd-container) Feb 23 03:09:46 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:09:46 localhost python3[71543]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:46 localhost python3[71563]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:46 localhost python3[71579]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:09:47 localhost python3[71597]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:09:47 localhost python3[71615]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:09:47 localhost python3[71631]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:09:47 localhost python3[71647]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:09:48 localhost python3[71663]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:09:48 localhost python3[71724]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:49 localhost python3[71753]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:49 localhost python3[71782]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:50 localhost 
python3[71811]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:50 localhost python3[71840]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:51 localhost python3[71869]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834188.2694278-110042-138288306449896/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:09:51 localhost python3[71885]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 03:09:51 localhost systemd[1]: Reloading. Feb 23 03:09:51 localhost systemd-rc-local-generator[71908]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 03:09:51 localhost systemd-sysv-generator[71914]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:52 localhost python3[71938]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:09:52 localhost systemd[1]: Reloading. Feb 23 03:09:53 localhost systemd-rc-local-generator[71965]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:53 localhost systemd-sysv-generator[71970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:53 localhost systemd[1]: Starting ceilometer_agent_compute container... Feb 23 03:09:53 localhost tripleo-start-podman-container[71978]: Creating additional drop-in dependency for "ceilometer_agent_compute" (68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9) Feb 23 03:09:53 localhost systemd[1]: Reloading. Feb 23 03:09:53 localhost systemd-rc-local-generator[72033]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:53 localhost systemd-sysv-generator[72037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:53 localhost systemd[1]: Started ceilometer_agent_compute container. Feb 23 03:09:54 localhost python3[72061]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:09:54 localhost systemd[1]: Stopping User Manager for UID 0... Feb 23 03:09:54 localhost systemd[71337]: Activating special unit Exit the Session... Feb 23 03:09:54 localhost systemd[71337]: Stopped target Main User Target. Feb 23 03:09:54 localhost systemd[71337]: Stopped target Basic System. Feb 23 03:09:54 localhost systemd[71337]: Stopped target Paths. Feb 23 03:09:54 localhost systemd[71337]: Stopped target Sockets. Feb 23 03:09:54 localhost systemd[71337]: Stopped target Timers. Feb 23 03:09:54 localhost systemd[71337]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 03:09:54 localhost systemd[71337]: Closed D-Bus User Message Bus Socket. Feb 23 03:09:54 localhost systemd[71337]: Stopped Create User's Volatile Files and Directories. Feb 23 03:09:54 localhost systemd[71337]: Removed slice User Application Slice. Feb 23 03:09:54 localhost systemd[71337]: Reached target Shutdown. Feb 23 03:09:54 localhost systemd[71337]: Finished Exit the Session. Feb 23 03:09:54 localhost systemd[71337]: Reached target Exit the Session. Feb 23 03:09:54 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 23 03:09:54 localhost systemd[1]: Stopped User Manager for UID 0. Feb 23 03:09:54 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... 
Feb 23 03:09:54 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 23 03:09:54 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 23 03:09:54 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 23 03:09:54 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 23 03:09:55 localhost systemd[1]: Reloading. Feb 23 03:09:55 localhost systemd-rc-local-generator[72091]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:55 localhost systemd-sysv-generator[72095]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:55 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:55 localhost systemd[1]: Starting ceilometer_agent_ipmi container... Feb 23 03:09:56 localhost systemd[1]: Started ceilometer_agent_ipmi container. Feb 23 03:09:56 localhost python3[72130]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:09:56 localhost systemd[1]: Reloading. Feb 23 03:09:56 localhost systemd-rc-local-generator[72158]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:56 localhost systemd-sysv-generator[72161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Feb 23 03:09:57 localhost systemd[1]: Starting logrotate_crond container... Feb 23 03:09:57 localhost systemd[1]: Started logrotate_crond container. Feb 23 03:09:58 localhost python3[72197]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:09:58 localhost systemd[1]: Reloading. Feb 23 03:09:59 localhost systemd-rc-local-generator[72220]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:09:59 localhost systemd-sysv-generator[72224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:09:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:09:59 localhost systemd[1]: Starting nova_migration_target container... Feb 23 03:09:59 localhost systemd[1]: Started nova_migration_target container. Feb 23 03:10:00 localhost python3[72263]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:10:00 localhost systemd[1]: Reloading. Feb 23 03:10:00 localhost systemd-sysv-generator[72295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:10:00 localhost systemd-rc-local-generator[72289]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 03:10:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:10:00 localhost systemd[1]: Starting ovn_controller container... Feb 23 03:10:00 localhost tripleo-start-podman-container[72303]: Creating additional drop-in dependency for "ovn_controller" (1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e) Feb 23 03:10:00 localhost systemd[1]: Reloading. Feb 23 03:10:00 localhost systemd-sysv-generator[72363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:10:00 localhost systemd-rc-local-generator[72360]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:10:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:10:01 localhost systemd[1]: Started ovn_controller container. Feb 23 03:10:01 localhost sshd[72373]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:10:01 localhost python3[72392]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:10:01 localhost systemd[1]: Reloading. Feb 23 03:10:01 localhost systemd-rc-local-generator[72418]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:10:01 localhost systemd-sysv-generator[72422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 03:10:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:10:02 localhost systemd[1]: Starting ovn_metadata_agent container... Feb 23 03:10:02 localhost systemd[1]: Started ovn_metadata_agent container. Feb 23 03:10:02 localhost python3[72473]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:10:04 localhost python3[72595]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005626463 step=4 update_config_hash_only=False Feb 23 03:10:04 localhost python3[72611]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:10:05 localhost python3[72627]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 23 03:10:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:10:06 localhost podman[72628]: 2026-02-23 08:10:06.914907771 +0000 UTC m=+0.088795281 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public) Feb 23 03:10:06 localhost podman[72628]: 2026-02-23 08:10:06.927530916 +0000 UTC m=+0.101418436 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, 
tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:10:06 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:10:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:10:09 localhost podman[72650]: 2026-02-23 08:10:09.895721269 +0000 UTC m=+0.070305935 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, container_name=iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z) Feb 23 03:10:09 localhost podman[72650]: 2026-02-23 08:10:09.909364345 +0000 UTC m=+0.083949071 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510) Feb 23 03:10:09 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:10:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:10:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:10:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:10:10 localhost systemd[1]: tmp-crun.wWNCkC.mount: Deactivated successfully. Feb 23 03:10:10 localhost podman[72671]: 2026-02-23 08:10:10.897575439 +0000 UTC m=+0.073719181 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:10:10 localhost podman[72673]: 2026-02-23 08:10:10.957262761 +0000 UTC m=+0.128020815 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack 
Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, release=1766032510, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public) Feb 23 03:10:10 localhost podman[72672]: 2026-02-23 08:10:10.9213351 +0000 UTC m=+0.092024042 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., 
release=1766032510, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 23 03:10:10 localhost podman[72671]: 2026-02-23 08:10:10.983376086 +0000 UTC m=+0.159519809 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510) Feb 23 03:10:11 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:10:11 localhost podman[72672]: 2026-02-23 08:10:11.004399212 +0000 UTC m=+0.175088084 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z) Feb 23 03:10:11 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:10:11 localhost podman[72673]: 2026-02-23 08:10:11.041772368 +0000 UTC m=+0.212530372 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, container_name=logrotate_crond) Feb 23 03:10:11 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:10:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:10:11 localhost podman[72744]: 2026-02-23 08:10:11.886484755 +0000 UTC m=+0.064558025 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:10:12 localhost podman[72744]: 2026-02-23 08:10:12.262190158 +0000 UTC m=+0.440263488 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, 
konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1) Feb 23 03:10:12 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:10:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:10:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:10:14 localhost systemd[1]: tmp-crun.JxDMPP.mount: Deactivated successfully. 
Feb 23 03:10:14 localhost podman[72766]: 2026-02-23 08:10:14.914844547 +0000 UTC m=+0.093262901 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., 
version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4) Feb 23 03:10:14 localhost podman[72766]: 2026-02-23 08:10:14.938151694 +0000 UTC m=+0.116570138 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, release=1766032510, version=17.1.13, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4) Feb 23 03:10:14 localhost systemd[1]: tmp-crun.2m37Cg.mount: Deactivated successfully. Feb 23 03:10:14 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:10:14 localhost podman[72767]: 2026-02-23 08:10:14.961134932 +0000 UTC m=+0.137435140 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public) Feb 23 03:10:15 localhost podman[72767]: 2026-02-23 08:10:15.032260851 +0000 UTC m=+0.208560979 container exec_died 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:10:15 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:10:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:10:16 localhost podman[72811]: 2026-02-23 08:10:16.916318967 +0000 UTC m=+0.088163541 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible) Feb 23 03:10:17 localhost podman[72811]: 2026-02-23 08:10:17.131046248 +0000 UTC m=+0.302890882 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:10:17 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:10:20 localhost sshd[72841]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:10:21 localhost snmpd[67690]: empty variable list in _query Feb 23 03:10:21 localhost snmpd[67690]: empty variable list in _query Feb 23 03:10:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:10:37 localhost systemd[1]: tmp-crun.ZAAYEN.mount: Deactivated successfully. 
Feb 23 03:10:37 localhost podman[72920]: 2026-02-23 08:10:37.941289754 +0000 UTC m=+0.108054943 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Feb 23 03:10:37 localhost podman[72920]: 2026-02-23 08:10:37.957488449 +0000 UTC m=+0.124253688 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13) Feb 23 03:10:37 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:10:40 localhost sshd[72942]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:10:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:10:40 localhost podman[72944]: 2026-02-23 08:10:40.919451468 +0000 UTC m=+0.094274393 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:10:40 localhost podman[72944]: 2026-02-23 08:10:40.938076569 +0000 UTC m=+0.112899504 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:10:40 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:10:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:10:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:10:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:10:41 localhost podman[72965]: 2026-02-23 08:10:41.909208841 +0000 UTC m=+0.083218458 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true) Feb 23 03:10:41 localhost podman[72965]: 2026-02-23 08:10:41.942211531 +0000 UTC m=+0.116221188 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 23 03:10:41 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:10:41 localhost podman[72964]: 2026-02-23 08:10:41.965468136 +0000 UTC m=+0.141565357 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:10:42 localhost podman[72966]: 2026-02-23 08:10:42.010923914 +0000 UTC m=+0.180788711 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:10:42 localhost podman[72966]: 2026-02-23 08:10:42.019607765 +0000 UTC m=+0.189472532 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 
03:10:42 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:10:42 localhost podman[72964]: 2026-02-23 08:10:42.069902135 +0000 UTC m=+0.245999306 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:10:42 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:10:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:10:42 localhost podman[73035]: 2026-02-23 08:10:42.913613881 +0000 UTC m=+0.087424920 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 23 03:10:43 localhost podman[73035]: 2026-02-23 08:10:43.331340965 +0000 UTC m=+0.505152044 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.buildah.version=1.41.5) Feb 23 03:10:43 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:10:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:10:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:10:45 localhost podman[73059]: 2026-02-23 08:10:45.912855003 +0000 UTC m=+0.091569808 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, 
vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller) Feb 23 03:10:45 localhost podman[73060]: 2026-02-23 08:10:45.954926535 +0000 UTC m=+0.130925185 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 03:10:45 localhost podman[73059]: 2026-02-23 08:10:45.965391352 +0000 UTC m=+0.144106157 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510) Feb 23 03:10:45 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:10:46 localhost podman[73060]: 2026-02-23 08:10:46.022254786 +0000 UTC m=+0.198253366 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, tcib_managed=true, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=) Feb 23 03:10:46 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:10:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:10:47 localhost systemd[1]: tmp-crun.ELZPGa.mount: Deactivated successfully. 
Feb 23 03:10:47 localhost podman[73107]: 2026-02-23 08:10:47.935603867 +0000 UTC m=+0.092982812 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 
17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:10:48 localhost podman[73107]: 2026-02-23 08:10:48.13533385 +0000 UTC m=+0.292712785 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1766032510, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 23 03:10:48 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:11:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:11:08 localhost podman[73137]: 2026-02-23 08:11:08.905444756 +0000 UTC m=+0.081004532 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, distribution-scope=public) Feb 23 03:11:08 localhost podman[73137]: 2026-02-23 08:11:08.919234284 +0000 UTC m=+0.094794060 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, container_name=collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, distribution-scope=public, batch=17.1_20260112.1) Feb 23 03:11:08 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:11:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:11:11 localhost systemd[1]: tmp-crun.MANFgF.mount: Deactivated successfully. Feb 23 03:11:11 localhost podman[73157]: 2026-02-23 08:11:11.9425808 +0000 UTC m=+0.077789450 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 23 03:11:11 localhost podman[73157]: 2026-02-23 08:11:11.975501215 +0000 UTC m=+0.110709825 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:11:11 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:11:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:11:12 localhost podman[73176]: 2026-02-23 08:11:12.091402333 +0000 UTC m=+0.084253625 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:11:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:11:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:11:12 localhost podman[73176]: 2026-02-23 08:11:12.12724265 +0000 UTC m=+0.120093962 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public) Feb 23 03:11:12 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:11:12 localhost podman[73198]: 2026-02-23 08:11:12.202067255 +0000 UTC m=+0.084573675 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron) Feb 23 03:11:12 localhost podman[73198]: 2026-02-23 08:11:12.213577641 +0000 UTC m=+0.096084091 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 23 03:11:12 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:11:12 localhost podman[73196]: 2026-02-23 08:11:12.295445528 +0000 UTC m=+0.179399094 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4) Feb 23 03:11:12 localhost podman[73196]: 2026-02-23 08:11:12.324782669 +0000 UTC m=+0.208736205 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:11:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:11:12 localhost systemd[1]: tmp-crun.bOSQiJ.mount: Deactivated successfully. Feb 23 03:11:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:11:13 localhost podman[73250]: 2026-02-23 08:11:13.883141494 +0000 UTC m=+0.066522482 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container) Feb 23 03:11:14 localhost podman[73250]: 2026-02-23 08:11:14.244644286 +0000 UTC m=+0.428025294 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:11:14 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:11:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:11:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:11:16 localhost podman[73274]: 2026-02-23 08:11:16.940962043 +0000 UTC m=+0.110380204 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, release=1766032510) Feb 23 03:11:16 localhost podman[73274]: 2026-02-23 08:11:16.986695484 +0000 UTC m=+0.156113625 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Feb 23 03:11:16 localhost systemd[1]: tmp-crun.up6nSo.mount: Deactivated successfully. Feb 23 03:11:17 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:11:17 localhost podman[73275]: 2026-02-23 08:11:17.005280414 +0000 UTC m=+0.174173518 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 23 03:11:17 localhost podman[73275]: 2026-02-23 08:11:17.049212139 +0000 UTC m=+0.218105293 container 
exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 23 03:11:17 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:11:17 localhost sshd[73322]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:11:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:11:18 localhost podman[73323]: 2026-02-23 08:11:18.911101715 +0000 UTC m=+0.084576455 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510) Feb 23 03:11:19 localhost podman[73323]: 2026-02-23 08:11:19.112060443 +0000 UTC m=+0.285535143 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1) Feb 23 03:11:19 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:11:19 localhost sshd[73352]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:11:26 localhost podman[73456]: 2026-02-23 08:11:26.738980953 +0000 UTC m=+0.094620393 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, ceph=True, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-type=git, release=1770267347, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph) Feb 23 03:11:26 localhost podman[73456]: 2026-02-23 08:11:26.848398666 +0000 UTC m=+0.204038096 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True) Feb 23 03:11:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:11:39 localhost systemd[1]: tmp-crun.0vd7f6.mount: Deactivated successfully. 
Feb 23 03:11:39 localhost podman[73602]: 2026-02-23 08:11:39.935841287 +0000 UTC m=+0.109658161 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:11:39 localhost podman[73602]: 2026-02-23 08:11:39.948286732 +0000 UTC m=+0.122103636 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:11:39 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:11:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:11:42 localhost podman[73623]: 2026-02-23 08:11:42.921100374 +0000 UTC m=+0.092548118 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Feb 23 03:11:42 localhost podman[73623]: 2026-02-23 08:11:42.954215335 +0000 UTC m=+0.125663049 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, 
com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:11:42 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:11:43 localhost podman[73622]: 2026-02-23 08:11:43.03064722 +0000 UTC m=+0.202897639 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:11:43 localhost podman[73625]: 2026-02-23 08:11:43.087081531 +0000 UTC m=+0.249971103 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-cron-container) Feb 23 03:11:43 localhost podman[73625]: 2026-02-23 08:11:43.095034494 +0000 UTC m=+0.257924086 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, 
build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 
17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:11:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:11:43 localhost podman[73622]: 2026-02-23 08:11:43.115644248 +0000 UTC m=+0.287894677 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:11:43 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:11:43 localhost podman[73624]: 2026-02-23 08:11:43.133920758 +0000 UTC m=+0.300837558 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible) Feb 23 03:11:43 localhost podman[73624]: 2026-02-23 08:11:43.168222646 +0000 UTC m=+0.335139436 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5) Feb 23 03:11:43 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:11:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:11:44 localhost podman[73715]: 2026-02-23 08:11:44.907635546 +0000 UTC m=+0.082808878 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:11:45 localhost podman[73715]: 2026-02-23 08:11:45.230260315 +0000 UTC m=+0.405433677 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 03:11:45 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:11:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:11:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:11:47 localhost podman[73739]: 2026-02-23 08:11:47.911449145 +0000 UTC m=+0.084899896 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 23 03:11:47 localhost podman[73739]: 2026-02-23 08:11:47.966490021 +0000 UTC m=+0.139940742 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:11:47 localhost systemd[1]: tmp-crun.cq10J5.mount: Deactivated successfully. Feb 23 03:11:47 localhost podman[73740]: 2026-02-23 08:11:47.978865695 +0000 UTC m=+0.147174123 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:11:47 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:11:48 localhost podman[73740]: 2026-02-23 08:11:48.051790019 +0000 UTC m=+0.220098377 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, container_name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:11:48 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:11:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:11:49 localhost podman[73789]: 2026-02-23 08:11:49.921227655 +0000 UTC m=+0.095774510 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, release=1766032510, com.redhat.component=openstack-qdrouterd-container) Feb 23 03:11:50 localhost podman[73789]: 2026-02-23 08:11:50.118800895 +0000 UTC m=+0.293347740 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=metrics_qdr) Feb 23 03:11:50 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:11:56 localhost sshd[73816]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:12:00 localhost sshd[73818]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:12:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:12:10 localhost podman[73820]: 2026-02-23 08:12:10.927060103 +0000 UTC m=+0.099700487 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, container_name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:12:10 localhost podman[73820]: 2026-02-23 08:12:10.934931803 +0000 UTC m=+0.107572167 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510) Feb 23 03:12:10 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:12:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:12:13 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:12:13 localhost recover_tripleo_nova_virtqemud[73865]: 61982 Feb 23 03:12:13 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:12:13 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:12:13 localhost podman[73840]: 2026-02-23 08:12:13.927526218 +0000 UTC m=+0.098534199 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:12:13 localhost podman[73840]: 2026-02-23 08:12:13.968284083 +0000 UTC m=+0.139292044 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, 
release=1766032510, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:12:13 localhost podman[73841]: 2026-02-23 08:12:13.976126392 +0000 UTC m=+0.146010077 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:12:13 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:12:13 localhost podman[73843]: 2026-02-23 08:12:13.996752307 +0000 UTC m=+0.157806902 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:12:14 localhost podman[73842]: 2026-02-23 08:12:14.033501314 +0000 UTC m=+0.198151754 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:12:14 localhost podman[73843]: 2026-02-23 08:12:14.038214914 +0000 UTC m=+0.199269529 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=logrotate_crond, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:12:14 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:12:14 localhost podman[73841]: 2026-02-23 08:12:14.062191954 +0000 UTC m=+0.232075669 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git) Feb 23 03:12:14 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:12:14 localhost podman[73842]: 2026-02-23 08:12:14.08723934 +0000 UTC m=+0.251889750 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:12:14 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:12:14 localhost systemd[1]: tmp-crun.moivRN.mount: Deactivated successfully. Feb 23 03:12:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:12:15 localhost systemd[1]: tmp-crun.Dp0eP3.mount: Deactivated successfully. 
Feb 23 03:12:15 localhost podman[73932]: 2026-02-23 08:12:15.919080276 +0000 UTC m=+0.091297569 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 23 03:12:16 localhost podman[73932]: 2026-02-23 08:12:16.304448083 +0000 UTC m=+0.476665406 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team) Feb 23 03:12:16 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:12:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:12:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:12:18 localhost podman[73957]: 2026-02-23 08:12:18.904579356 +0000 UTC m=+0.075610861 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 23 03:12:18 localhost podman[73957]: 2026-02-23 08:12:18.951020282 +0000 UTC m=+0.122051757 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 23 03:12:18 localhost systemd[1]: tmp-crun.YN1gty.mount: Deactivated successfully. Feb 23 03:12:18 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:12:18 localhost podman[73956]: 2026-02-23 08:12:18.974946441 +0000 UTC m=+0.148154116 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public) Feb 23 03:12:19 localhost podman[73956]: 2026-02-23 08:12:19.027290993 +0000 UTC m=+0.200498638 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible) Feb 23 03:12:19 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:12:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:12:20 localhost podman[74002]: 2026-02-23 08:12:20.91516557 +0000 UTC m=+0.090066581 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, container_name=metrics_qdr, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 23 03:12:21 localhost podman[74002]: 2026-02-23 08:12:21.109951204 +0000 UTC m=+0.284852245 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z) Feb 23 03:12:21 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:12:36 localhost sshd[74109]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:12:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:12:41 localhost systemd[1]: tmp-crun.56vU8m.mount: Deactivated successfully. 
Feb 23 03:12:41 localhost podman[74111]: 2026-02-23 08:12:41.933147823 +0000 UTC m=+0.102226126 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3) Feb 23 03:12:41 localhost podman[74111]: 2026-02-23 08:12:41.950203755 +0000 UTC m=+0.119282068 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 23 03:12:41 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:12:43 localhost python3[74180]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Feb 23 03:12:44 localhost python3[74225]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834363.3614283-114199-27635855336001/source _original_basename=tmpyj21v48e follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:12:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:12:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:12:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:12:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:12:44 localhost systemd[1]: tmp-crun.AaxDDS.mount: Deactivated successfully. 
Feb 23 03:12:44 localhost podman[74241]: 2026-02-23 08:12:44.945362212 +0000 UTC m=+0.110871372 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=) Feb 23 03:12:44 localhost podman[74243]: 2026-02-23 08:12:44.978945158 +0000 UTC m=+0.137517837 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, container_name=logrotate_crond, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 
17.1 cron, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) 
Feb 23 03:12:44 localhost podman[74241]: 2026-02-23 08:12:44.983243174 +0000 UTC m=+0.148752294 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Feb 23 03:12:44 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:12:45 localhost podman[74240]: 2026-02-23 08:12:45.045067977 +0000 UTC m=+0.213531341 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:12:45 localhost python3[74302]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:12:45 localhost podman[74240]: 2026-02-23 08:12:45.085983887 +0000 UTC m=+0.254447201 container exec_died 
40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container) Feb 23 03:12:45 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:12:45 localhost podman[74242]: 2026-02-23 08:12:45.102577304 +0000 UTC m=+0.263622082 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public) Feb 23 03:12:45 localhost podman[74243]: 2026-02-23 08:12:45.117536289 +0000 UTC m=+0.276109028 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 23 03:12:45 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated 
successfully. Feb 23 03:12:45 localhost podman[74242]: 2026-02-23 08:12:45.142427979 +0000 UTC m=+0.303472747 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, release=1766032510, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-type=git, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc.) Feb 23 03:12:45 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:12:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:12:46 localhost systemd[1]: tmp-crun.usEInn.mount: Deactivated successfully. 
Feb 23 03:12:46 localhost podman[74517]: 2026-02-23 08:12:46.722168492 +0000 UTC m=+0.088954406 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, 
container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:12:46 localhost ansible-async_wrapper.py[74516]: Invoked with 305274563872 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834366.184046-114335-11440511828170/AnsiballZ_command.py _ Feb 23 03:12:46 localhost ansible-async_wrapper.py[74539]: Starting module and watcher Feb 23 03:12:46 localhost ansible-async_wrapper.py[74539]: Start watching 74540 (3600) Feb 23 03:12:46 localhost ansible-async_wrapper.py[74540]: Start module (74540) Feb 23 03:12:46 localhost ansible-async_wrapper.py[74516]: Return async_wrapper task started. 
Feb 23 03:12:47 localhost podman[74517]: 2026-02-23 08:12:47.107216638 +0000 UTC m=+0.474002512 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:12:47 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:12:47 localhost python3[74561]: ansible-ansible.legacy.async_status Invoked with jid=305274563872.74516 mode=status _async_dir=/tmp/.ansible_async Feb 23 03:12:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:12:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:12:49 localhost podman[74616]: 2026-02-23 08:12:49.691475948 +0000 UTC m=+0.093619124 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:12:49 localhost podman[74616]: 2026-02-23 08:12:49.724249989 +0000 UTC m=+0.126393175 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 23 03:12:49 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:12:49 localhost podman[74617]: 2026-02-23 08:12:49.748277762 +0000 UTC m=+0.148584229 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 23 03:12:49 localhost podman[74617]: 2026-02-23 08:12:49.798531657 +0000 UTC m=+0.198838134 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 23 03:12:49 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully.
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 23 03:12:50 localhost puppet-user[74559]: (file: /etc/puppet/hiera.yaml)
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: Undefined variable '::deploy_config_name';
Feb 23 03:12:50 localhost puppet-user[74559]: (file & line not available)
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 23 03:12:50 localhost puppet-user[74559]: (file & line not available)
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:12:50 localhost puppet-user[74559]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:12:50 localhost puppet-user[74559]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:12:50 localhost puppet-user[74559]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:12:50 localhost puppet-user[74559]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:12:50 localhost puppet-user[74559]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:12:50 localhost puppet-user[74559]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:12:50 localhost puppet-user[74559]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:12:50 localhost puppet-user[74559]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:12:50 localhost puppet-user[74559]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 23 03:12:50 localhost puppet-user[74559]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 23 03:12:50 localhost puppet-user[74559]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 23 03:12:50 localhost puppet-user[74559]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 23 03:12:50 localhost puppet-user[74559]: Notice: Compiled catalog for np0005626463.localdomain in environment production in 0.21 seconds
Feb 23 03:12:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.
Feb 23 03:12:51 localhost puppet-user[74559]: Notice: Applied catalog in 0.35 seconds
Feb 23 03:12:51 localhost puppet-user[74559]: Application:
Feb 23 03:12:51 localhost puppet-user[74559]: Initial environment: production
Feb 23 03:12:51 localhost puppet-user[74559]: Converged environment: production
Feb 23 03:12:51 localhost puppet-user[74559]: Run mode: user
Feb 23 03:12:51 localhost puppet-user[74559]: Changes:
Feb 23 03:12:51 localhost puppet-user[74559]: Events:
Feb 23 03:12:51 localhost puppet-user[74559]: Resources:
Feb 23 03:12:51 localhost puppet-user[74559]: Total: 19
Feb 23 03:12:51 localhost puppet-user[74559]: Time:
Feb 23 03:12:51 localhost puppet-user[74559]: Package: 0.00
Feb 23 03:12:51 localhost puppet-user[74559]: Schedule: 0.00
Feb 23 03:12:51 localhost puppet-user[74559]: Augeas: 0.01
Feb 23 03:12:51 localhost puppet-user[74559]: Exec: 0.01
Feb 23 03:12:51 localhost puppet-user[74559]: File: 0.02
Feb 23 03:12:51 localhost puppet-user[74559]: Service: 0.09
Feb 23 03:12:51 localhost puppet-user[74559]: Config retrieval: 0.28
Feb 23 03:12:51 localhost puppet-user[74559]: Transaction evaluation: 0.34
Feb 23 03:12:51 localhost puppet-user[74559]: Catalog application: 0.35
Feb 23 03:12:51 localhost puppet-user[74559]: Last run: 1771834371
Feb 23 03:12:51 localhost puppet-user[74559]: Filebucket: 0.00
Feb 23 03:12:51 localhost puppet-user[74559]: Total: 0.35
Feb 23 03:12:51 localhost puppet-user[74559]: Version:
Feb 23 03:12:51 localhost puppet-user[74559]: Config: 1771834370
Feb 23 03:12:51 localhost puppet-user[74559]: Puppet: 7.10.0
Feb 23 03:12:51 localhost systemd[1]: tmp-crun.vMBbBH.mount: Deactivated successfully.
Feb 23 03:12:51 localhost podman[74731]: 2026-02-23 08:12:51.256255614 +0000 UTC m=+0.094544202 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 23 03:12:51 localhost ansible-async_wrapper.py[74540]: Module complete (74540)
Feb 23 03:12:51 localhost podman[74731]: 2026-02-23 08:12:51.469405413 +0000 UTC m=+0.307694031 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 23 03:12:51 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully.
Feb 23 03:12:51 localhost ansible-async_wrapper.py[74539]: Done in kid B.
Feb 23 03:12:53 localhost sshd[74760]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:12:57 localhost python3[74777]: ansible-ansible.legacy.async_status Invoked with jid=305274563872.74516 mode=status _async_dir=/tmp/.ansible_async
Feb 23 03:12:58 localhost python3[74793]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 03:12:58 localhost python3[74809]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 03:12:59 localhost python3[74859]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:12:59 localhost python3[74877]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpgcev01x7 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 03:12:59 localhost python3[74907]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:01 localhost python3[75012]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 23 03:13:01 localhost python3[75031]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:02 localhost python3[75063]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 23 03:13:03 localhost python3[75113]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:13:03 localhost python3[75131]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:04 localhost python3[75193]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:13:04 localhost python3[75211]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:05 localhost python3[75273]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:13:05 localhost python3[75291]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:05 localhost python3[75353]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:13:06 localhost python3[75371]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:06 localhost python3[75401]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:13:06 localhost systemd[1]: Reloading.
Feb 23 03:13:06 localhost systemd-rc-local-generator[75425]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:13:06 localhost systemd-sysv-generator[75431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:13:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:13:08 localhost python3[75487]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:13:08 localhost python3[75505]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:09 localhost python3[75567]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 23 03:13:09 localhost python3[75585]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 03:13:10 localhost python3[75615]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 03:13:10 localhost systemd[1]: Reloading.
Feb 23 03:13:10 localhost systemd-sysv-generator[75646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 03:13:10 localhost systemd-rc-local-generator[75640]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 03:13:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 03:13:10 localhost systemd[1]: Starting Create netns directory...
Feb 23 03:13:10 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 23 03:13:10 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 23 03:13:10 localhost systemd[1]: Finished Create netns directory.
Feb 23 03:13:11 localhost python3[75672]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 23 03:13:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 03:13:12 localhost systemd[1]: tmp-crun.OEajZb.mount: Deactivated successfully.
Feb 23 03:13:12 localhost podman[75715]: 2026-02-23 08:13:12.940957638 +0000 UTC m=+0.111417055 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, distribution-scope=public)
Feb 23 03:13:12 localhost podman[75715]: 2026-02-23 08:13:12.983482815 +0000 UTC m=+0.153942192 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=)
Feb 23 03:13:12 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully.
Feb 23 03:13:13 localhost python3[75749]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 23 03:13:14 localhost podman[75790]: 2026-02-23 08:13:14.112424721 +0000 UTC m=+0.099928898 container create c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git)
Feb 23 03:13:14 localhost podman[75790]: 2026-02-23 08:13:14.061576235 +0000 UTC m=+0.049080412 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 23 03:13:14 localhost systemd[1]: Started libpod-conmon-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope.
Feb 23 03:13:14 localhost systemd[1]: Started libcrun container.
Feb 23 03:13:14 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 03:13:14 localhost recover_tripleo_nova_virtqemud[75808]: 61982
Feb 23 03:13:14 localhost podman[75790]: 2026-02-23 08:13:14.2330997 +0000 UTC m=+0.220603867 container init c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64)
Feb 23 03:13:14 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 03:13:14 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 03:13:14 localhost systemd[1]: tmp-crun.u0L254.mount: Deactivated successfully.
Feb 23 03:13:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 03:13:14 localhost podman[75790]: 2026-02-23 08:13:14.283246755 +0000 UTC m=+0.270750922 container start c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 23 03:13:14 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 23 03:13:14 localhost python3[75749]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 03:13:14 localhost systemd[1]: Created slice User Slice of UID 0. Feb 23 03:13:14 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Feb 23 03:13:14 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 23 03:13:14 localhost systemd[1]: Starting User Manager for UID 0... Feb 23 03:13:14 localhost podman[75813]: 2026-02-23 08:13:14.388097894 +0000 UTC m=+0.094658364 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5) Feb 23 03:13:14 localhost podman[75813]: 2026-02-23 08:13:14.446251516 +0000 UTC m=+0.152812016 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5) Feb 23 03:13:14 localhost podman[75813]: unhealthy Feb 23 03:13:14 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:13:14 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:13:14 localhost systemd[75831]: Queued start job for default target Main User Target. Feb 23 03:13:14 localhost systemd[75831]: Created slice User Application Slice. Feb 23 03:13:14 localhost systemd[75831]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 23 03:13:14 localhost systemd[75831]: Started Daily Cleanup of User's Temporary Directories. Feb 23 03:13:14 localhost systemd[75831]: Reached target Paths. Feb 23 03:13:14 localhost systemd[75831]: Reached target Timers. Feb 23 03:13:14 localhost systemd[75831]: Starting D-Bus User Message Bus Socket... Feb 23 03:13:14 localhost systemd[75831]: Starting Create User's Volatile Files and Directories... 
Feb 23 03:13:14 localhost systemd[75831]: Listening on D-Bus User Message Bus Socket. Feb 23 03:13:14 localhost systemd[75831]: Reached target Sockets. Feb 23 03:13:14 localhost systemd[75831]: Finished Create User's Volatile Files and Directories. Feb 23 03:13:14 localhost systemd[75831]: Reached target Basic System. Feb 23 03:13:14 localhost systemd[75831]: Reached target Main User Target. Feb 23 03:13:14 localhost systemd[75831]: Startup finished in 161ms. Feb 23 03:13:14 localhost systemd[1]: Started User Manager for UID 0. Feb 23 03:13:14 localhost systemd[1]: Started Session c10 of User root. Feb 23 03:13:14 localhost systemd[1]: session-c10.scope: Deactivated successfully. Feb 23 03:13:14 localhost podman[75914]: 2026-02-23 08:13:14.792671682 +0000 UTC m=+0.081004562 container create e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', 
'/var/lib/container-config-scripts:/container-config-scripts']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_wait_for_compute_service, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:13:14 localhost systemd[1]: Started libpod-conmon-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a.scope. Feb 23 03:13:14 localhost podman[75914]: 2026-02-23 08:13:14.742333642 +0000 UTC m=+0.030666502 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 03:13:14 localhost systemd[1]: Started libcrun container. 
Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Feb 23 03:13:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Feb 23 03:13:14 localhost podman[75914]: 2026-02-23 08:13:14.870216815 +0000 UTC m=+0.158549635 container init e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_wait_for_compute_service, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:13:14 localhost podman[75914]: 2026-02-23 08:13:14.881080062 +0000 UTC m=+0.169412902 container start e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, 
version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_wait_for_compute_service, vcs-type=git) Feb 23 03:13:14 localhost podman[75914]: 2026-02-23 08:13:14.881501485 +0000 UTC m=+0.169834325 container attach 
e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:13:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:13:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:13:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:13:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:13:15 localhost podman[75938]: 2026-02-23 08:13:15.174169224 +0000 UTC m=+0.099154753 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:13:15 localhost podman[75938]: 2026-02-23 08:13:15.204292958 +0000 UTC m=+0.129278497 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13) Feb 23 03:13:15 localhost podman[75950]: 2026-02-23 08:13:15.241876672 +0000 UTC m=+0.090363831 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, 
tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=logrotate_crond, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, 
summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public) Feb 23 03:13:15 localhost podman[75972]: 2026-02-23 08:13:15.253034378 +0000 UTC m=+0.069770133 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 23 03:13:15 localhost podman[75972]: 2026-02-23 08:13:15.278420254 +0000 UTC m=+0.095156009 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:13:15 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:13:15 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:13:15 localhost podman[75950]: 2026-02-23 08:13:15.333513702 +0000 UTC m=+0.182000801 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:13:15 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:13:15 localhost podman[75949]: 2026-02-23 08:13:15.283998977 +0000 UTC m=+0.137398099 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, distribution-scope=public, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:13:15 localhost podman[75949]: 2026-02-23 08:13:15.416532955 +0000 UTC m=+0.269932067 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:13:15 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:13:16 localhost sshd[76030]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:13:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:13:17 localhost podman[76032]: 2026-02-23 08:13:17.902775212 +0000 UTC m=+0.079017840 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:13:18 localhost podman[76032]: 2026-02-23 08:13:18.28312774 +0000 UTC m=+0.459370308 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1) Feb 23 03:13:18 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. 
Feb 23 03:13:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:13:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:13:19 localhost systemd[1]: tmp-crun.dgOSWX.mount: Deactivated successfully. Feb 23 03:13:19 localhost podman[76054]: 2026-02-23 08:13:19.929085088 +0000 UTC m=+0.101773325 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, release=1766032510, vcs-type=git, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, 
io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Feb 23 03:13:19 localhost podman[76055]: 2026-02-23 08:13:19.974803295 +0000 UTC m=+0.144761868 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 23 03:13:20 localhost podman[76054]: 2026-02-23 08:13:20.010180782 +0000 UTC m=+0.182869019 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, architecture=x86_64, 
konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:13:20 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:13:20 localhost podman[76055]: 2026-02-23 08:13:20.04918768 +0000 UTC m=+0.219146283 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:13:20 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:13:21 localhost podman[76102]: 2026-02-23 08:13:21.918949904 +0000 UTC m=+0.089647789 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=metrics_qdr, version=17.1.13) Feb 23 03:13:22 localhost podman[76102]: 2026-02-23 08:13:22.153510493 +0000 UTC m=+0.324208318 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:13:22 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:13:24 localhost systemd[1]: Stopping User Manager for UID 0... Feb 23 03:13:24 localhost systemd[75831]: Activating special unit Exit the Session... Feb 23 03:13:24 localhost systemd[75831]: Stopped target Main User Target. Feb 23 03:13:24 localhost systemd[75831]: Stopped target Basic System. Feb 23 03:13:24 localhost systemd[75831]: Stopped target Paths. Feb 23 03:13:24 localhost systemd[75831]: Stopped target Sockets. 
Feb 23 03:13:24 localhost systemd[75831]: Stopped target Timers. Feb 23 03:13:24 localhost systemd[75831]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 03:13:24 localhost systemd[75831]: Closed D-Bus User Message Bus Socket. Feb 23 03:13:24 localhost systemd[75831]: Stopped Create User's Volatile Files and Directories. Feb 23 03:13:24 localhost systemd[75831]: Removed slice User Application Slice. Feb 23 03:13:24 localhost systemd[75831]: Reached target Shutdown. Feb 23 03:13:24 localhost systemd[75831]: Finished Exit the Session. Feb 23 03:13:24 localhost systemd[75831]: Reached target Exit the Session. Feb 23 03:13:24 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 23 03:13:24 localhost systemd[1]: Stopped User Manager for UID 0. Feb 23 03:13:24 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 23 03:13:24 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 23 03:13:24 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 23 03:13:24 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 23 03:13:24 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 23 03:13:42 localhost sshd[76210]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:13:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:13:43 localhost podman[76212]: 2026-02-23 08:13:43.482460649 +0000 UTC m=+0.094079757 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=collectd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:13:43 localhost podman[76212]: 2026-02-23 08:13:43.497249988 +0000 UTC m=+0.108869146 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-collectd) Feb 23 03:13:43 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:13:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:13:44 localhost podman[76233]: 2026-02-23 08:13:44.894897909 +0000 UTC m=+0.071980090 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:13:44 localhost podman[76233]: 2026-02-23 08:13:44.95684316 +0000 UTC m=+0.133925341 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, container_name=nova_compute, batch=17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5) Feb 23 03:13:44 localhost podman[76233]: unhealthy Feb 23 03:13:44 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:13:44 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:13:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:13:45 localhost systemd[1]: tmp-crun.SUgLPN.mount: Deactivated successfully. 
Feb 23 03:13:45 localhost podman[76256]: 2026-02-23 08:13:45.916916891 +0000 UTC m=+0.090367791 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, container_name=iscsid, release=1766032510) Feb 23 03:13:45 localhost podman[76256]: 2026-02-23 08:13:45.930291986 +0000 UTC m=+0.103742906 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:13:45 localhost podman[76259]: 2026-02-23 08:13:45.974716143 +0000 UTC m=+0.140479535 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': 
{'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:13:45 localhost 
systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:13:46 localhost podman[76258]: 2026-02-23 08:13:46.074207476 +0000 UTC m=+0.242362082 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64) Feb 23 03:13:46 localhost podman[76259]: 2026-02-23 08:13:46.089892482 +0000 UTC m=+0.255655914 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
io.buildah.version=1.41.5, release=1766032510, container_name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:13:46 localhost podman[76257]: 2026-02-23 08:13:46.039226442 +0000 UTC m=+0.208173522 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, 
com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z) Feb 23 03:13:46 localhost podman[76257]: 2026-02-23 08:13:46.123345279 +0000 UTC m=+0.292292349 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 23 03:13:46 localhost podman[76258]: 2026-02-23 08:13:46.131584315 +0000 UTC m=+0.299738851 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:13:46 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:13:46 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:13:46 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:13:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:13:48 localhost podman[76350]: 2026-02-23 08:13:48.905336593 +0000 UTC m=+0.082415565 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, version=17.1.13, container_name=nova_migration_target, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:13:49 localhost podman[76350]: 2026-02-23 08:13:49.294381029 +0000 UTC m=+0.471459971 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:13:49 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. 
Feb 23 03:13:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:13:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:13:50 localhost podman[76372]: 2026-02-23 08:13:50.90395137 +0000 UTC m=+0.079584888 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:13:50 localhost podman[76372]: 2026-02-23 08:13:50.956125666 +0000 UTC m=+0.131759154 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
release=1766032510, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:13:50 localhost podman[76373]: 2026-02-23 08:13:50.969386467 +0000 UTC m=+0.139092892 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:13:50 localhost systemd[1]: 
1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:13:51 localhost podman[76373]: 2026-02-23 08:13:51.020919594 +0000 UTC m=+0.190626029 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:13:51 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:13:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:13:52 localhost podman[76418]: 2026-02-23 08:13:52.92076705 +0000 UTC m=+0.092253429 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vendor=Red Hat, Inc.) Feb 23 03:13:53 localhost podman[76418]: 2026-02-23 08:13:53.136410503 +0000 UTC m=+0.307896882 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:13:53 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:13:57 localhost sshd[76448]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:14:01 localhost systemd[1]: session-27.scope: Deactivated successfully. Feb 23 03:14:01 localhost systemd[1]: session-27.scope: Consumed 3.078s CPU time. Feb 23 03:14:01 localhost systemd-logind[759]: Session 27 logged out. Waiting for processes to exit. Feb 23 03:14:01 localhost systemd-logind[759]: Removed session 27. 
Feb 23 03:14:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:14:13 localhost podman[76450]: 2026-02-23 08:14:13.907922954 +0000 UTC m=+0.082668538 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, release=1766032510, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd) Feb 23 03:14:13 localhost podman[76450]: 2026-02-23 08:14:13.924577969 +0000 UTC m=+0.099323523 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:14:13 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated 
successfully. Feb 23 03:14:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:14:15 localhost podman[76470]: 2026-02-23 08:14:15.911006433 +0000 UTC m=+0.083731401 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 23 03:14:15 localhost podman[76470]: 2026-02-23 08:14:15.978373696 +0000 UTC m=+0.151098634 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_compute, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:14:15 localhost podman[76470]: unhealthy Feb 23 03:14:15 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:14:15 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:14:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:14:16 localhost systemd[1]: tmp-crun.7wOPKA.mount: Deactivated successfully. 
Feb 23 03:14:16 localhost podman[76496]: 2026-02-23 08:14:16.984168835 +0000 UTC m=+0.150108034 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:14:17 localhost podman[76496]: 2026-02-23 08:14:17.017410683 +0000 UTC m=+0.183349842 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, distribution-scope=public) Feb 23 03:14:17 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:14:17 localhost podman[76493]: 2026-02-23 08:14:17.057674454 +0000 UTC m=+0.231654857 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, version=17.1.13, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, 
summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:14:17 localhost podman[76494]: 2026-02-23 08:14:17.022204319 +0000 UTC m=+0.191347435 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:14:17 localhost podman[76493]: 2026-02-23 08:14:17.073505575 +0000 UTC m=+0.247486038 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, 
build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:14:17 localhost podman[76495]: 2026-02-23 08:14:16.935102287 +0000 UTC m=+0.102491580 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:14:17 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:14:17 localhost podman[76495]: 2026-02-23 08:14:17.114783507 +0000 UTC m=+0.282172850 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, 
release=1766032510, version=17.1.13, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 23 03:14:17 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:14:17 localhost podman[76494]: 2026-02-23 08:14:17.158526403 +0000 UTC m=+0.327669519 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13) Feb 23 03:14:17 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:14:17 localhost systemd[1]: tmp-crun.bZvGKK.mount: Deactivated successfully. Feb 23 03:14:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:14:19 localhost systemd[1]: tmp-crun.DIOh5Z.mount: Deactivated successfully. 
Feb 23 03:14:19 localhost podman[76586]: 2026-02-23 08:14:19.911016604 +0000 UTC m=+0.085589198 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:14:20 localhost podman[76586]: 2026-02-23 08:14:20.256771361 +0000 UTC m=+0.431343915 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack 
TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:14:20 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:14:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:14:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:14:21 localhost podman[76611]: 2026-02-23 08:14:21.911798753 +0000 UTC m=+0.082477752 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:14:21 localhost podman[76610]: 2026-02-23 08:14:21.962106149 +0000 UTC m=+0.134472129 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 23 03:14:21 localhost podman[76611]: 2026-02-23 08:14:21.982035144 +0000 UTC m=+0.152714143 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, 
architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.) Feb 23 03:14:21 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:14:22 localhost podman[76610]: 2026-02-23 08:14:22.013271971 +0000 UTC m=+0.185637961 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:14:22 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:14:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:14:23 localhost podman[76654]: 2026-02-23 08:14:23.908028105 +0000 UTC m=+0.083355189 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, 
name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, url=https://www.redhat.com) Feb 23 03:14:24 localhost podman[76654]: 2026-02-23 08:14:24.111319731 +0000 UTC m=+0.286646805 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, tcib_managed=true, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com) Feb 23 03:14:24 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:14:32 localhost sshd[76696]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:14:41 localhost sshd[76763]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:14:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:14:44 localhost podman[76765]: 2026-02-23 08:14:44.919274202 +0000 UTC m=+0.091173076 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:14:44 localhost podman[76765]: 2026-02-23 08:14:44.957559093 +0000 UTC m=+0.129457967 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 23 03:14:44 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:14:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:14:46 localhost systemd[1]: tmp-crun.iFimzD.mount: Deactivated successfully. Feb 23 03:14:46 localhost podman[76785]: 2026-02-23 08:14:46.91734653 +0000 UTC m=+0.093799076 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) 
Feb 23 03:14:46 localhost podman[76785]: 2026-02-23 08:14:46.976218885 +0000 UTC m=+0.152671471 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13) Feb 23 03:14:46 localhost podman[76785]: unhealthy Feb 23 03:14:46 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:14:46 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. 
Feb 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:14:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:14:47 localhost podman[76808]: 2026-02-23 08:14:47.915450815 +0000 UTC m=+0.086469694 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:14:47 localhost podman[76808]: 2026-02-23 08:14:47.953764707 +0000 UTC m=+0.124783576 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3) Feb 23 03:14:47 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:14:47 localhost podman[76809]: 2026-02-23 08:14:47.981309582 +0000 UTC m=+0.150499646 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:14:48 localhost systemd[1]: tmp-crun.rxzvrp.mount: Deactivated successfully. Feb 23 03:14:48 localhost podman[76811]: 2026-02-23 08:14:48.03166645 +0000 UTC m=+0.195475591 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:14:48 localhost podman[76809]: 2026-02-23 08:14:48.04024215 +0000 UTC m=+0.209432204 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, 
managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:47Z, tcib_managed=true, config_id=tripleo_step4, version=17.1.13, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:14:48 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:14:48 localhost podman[76811]: 2026-02-23 08:14:48.071181548 +0000 UTC m=+0.234990699 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, tcib_managed=true) Feb 23 03:14:48 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:14:48 localhost podman[76810]: 2026-02-23 08:14:48.12464962 +0000 UTC m=+0.289284825 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true) Feb 23 03:14:48 localhost podman[76810]: 2026-02-23 08:14:48.154131475 +0000 UTC m=+0.318766700 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:14:48 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:14:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:14:50 localhost podman[76896]: 2026-02-23 08:14:50.904993706 +0000 UTC m=+0.081637077 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:14:51 localhost podman[76896]: 2026-02-23 08:14:51.290385045 +0000 UTC m=+0.467028456 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:14:51 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:14:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:14:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:14:52 localhost podman[76919]: 2026-02-23 08:14:52.919507201 +0000 UTC m=+0.089709942 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, tcib_managed=true) Feb 23 03:14:52 localhost sshd[76955]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:14:52 localhost podman[76919]: 2026-02-23 08:14:52.975277163 +0000 UTC m=+0.145479904 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Feb 23 03:14:52 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:14:53 localhost podman[76920]: 2026-02-23 08:14:52.978943754 +0000 UTC m=+0.148082023 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:14:53 localhost podman[76920]: 2026-02-23 08:14:53.061224529 +0000 UTC m=+0.230362718 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, release=1766032510, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64) Feb 23 03:14:53 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:14:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:14:54 localhost podman[76970]: 2026-02-23 08:14:54.523934208 +0000 UTC m=+0.089090634 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team) Feb 23 03:14:54 localhost podman[76970]: 2026-02-23 08:14:54.723320845 +0000 UTC m=+0.288477231 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:14:54 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:15:07 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:15:07 localhost recover_tripleo_nova_virtqemud[77000]: 61982 Feb 23 03:15:07 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:15:07 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:15:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:15:15 localhost systemd[1]: tmp-crun.OGEkLj.mount: Deactivated successfully. Feb 23 03:15:15 localhost podman[77001]: 2026-02-23 08:15:15.917404436 +0000 UTC m=+0.090120683 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git) Feb 23 03:15:15 localhost podman[77001]: 2026-02-23 08:15:15.931170788 +0000 UTC m=+0.103887055 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, 
description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:15:15 localhost systemd[1]: 
186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:15:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:15:17 localhost systemd[1]: tmp-crun.yrKcAM.mount: Deactivated successfully. Feb 23 03:15:17 localhost podman[77021]: 2026-02-23 08:15:17.917906793 +0000 UTC m=+0.092155152 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:15:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:15:18 localhost podman[77021]: 2026-02-23 08:15:18.006297886 +0000 UTC m=+0.180546205 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:15:18 localhost podman[77021]: unhealthy Feb 23 03:15:18 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:15:18 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:15:18 localhost systemd[1]: tmp-crun.gVYIzh.mount: Deactivated successfully. 
Feb 23 03:15:18 localhost podman[77043]: 2026-02-23 08:15:18.107178002 +0000 UTC m=+0.095360546 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, io.openshift.expose-services=, tcib_managed=true) Feb 23 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:15:18 localhost podman[77043]: 2026-02-23 08:15:18.158636245 +0000 UTC m=+0.146818789 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, container_name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, summary=Red 
Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:15:18 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:15:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:15:18 localhost podman[77063]: 2026-02-23 08:15:18.238394696 +0000 UTC m=+0.095581223 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:15:18 localhost podman[77063]: 2026-02-23 08:15:18.271484242 +0000 UTC m=+0.128670739 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:15:18 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:15:18 localhost podman[77064]: 2026-02-23 08:15:18.287755527 +0000 UTC m=+0.142557805 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z) Feb 23 03:15:18 localhost podman[77064]: 2026-02-23 08:15:18.299341856 +0000 UTC m=+0.154144144 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, architecture=x86_64, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 23 03:15:18 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:15:18 localhost podman[77084]: 2026-02-23 08:15:18.390044925 +0000 UTC m=+0.188061774 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5) Feb 23 03:15:18 localhost podman[77084]: 2026-02-23 08:15:18.42033319 +0000 UTC m=+0.218350019 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi) Feb 23 03:15:18 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:15:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:15:21 localhost podman[77136]: 2026-02-23 08:15:21.906586788 +0000 UTC m=+0.078839584 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:15:22 localhost podman[77136]: 2026-02-23 08:15:22.317585304 +0000 UTC m=+0.489838100 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git) Feb 23 03:15:22 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:15:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:15:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:15:23 localhost podman[77160]: 2026-02-23 08:15:23.909382253 +0000 UTC m=+0.083429748 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:15:23 localhost podman[77160]: 2026-02-23 08:15:23.935507507 +0000 UTC m=+0.109554982 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:15:23 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:15:24 localhost podman[77161]: 2026-02-23 08:15:24.021826667 +0000 UTC m=+0.190017860 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 23 03:15:24 localhost podman[77161]: 2026-02-23 08:15:24.090217486 +0000 UTC m=+0.258408709 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64) Feb 23 03:15:24 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:15:24 localhost sshd[77209]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:15:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:15:24 localhost systemd[1]: tmp-crun.yCvp0K.mount: Deactivated successfully. 
Feb 23 03:15:24 localhost podman[77211]: 2026-02-23 08:15:24.945441198 +0000 UTC m=+0.119048738 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:15:25 localhost podman[77211]: 2026-02-23 08:15:25.138325602 +0000 UTC m=+0.311933172 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:15:25 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:15:25 localhost sshd[77241]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:15:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:15:46 localhost podman[77320]: 2026-02-23 08:15:46.914738469 +0000 UTC m=+0.087442025 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.buildah.version=1.41.5) Feb 23 03:15:46 localhost podman[77320]: 2026-02-23 08:15:46.926808763 +0000 UTC m=+0.099512279 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:15Z, 
vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:15:46 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:15:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:15:48 localhost podman[77341]: 2026-02-23 08:15:48.929563776 +0000 UTC m=+0.103967209 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z) Feb 23 03:15:49 localhost podman[77342]: 2026-02-23 08:15:49.009438679 +0000 UTC m=+0.177647260 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510) Feb 23 03:15:49 localhost podman[77340]: 2026-02-23 08:15:48.97934468 +0000 UTC m=+0.154278328 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:15:49 localhost podman[77340]: 2026-02-23 08:15:49.05979705 +0000 UTC m=+0.234730678 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, 
url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Feb 23 03:15:49 localhost podman[77343]: 2026-02-23 08:15:49.075093816 +0000 UTC m=+0.242167934 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, url=https://www.redhat.com, 
com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack 
Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true) Feb 23 03:15:49 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:15:49 localhost podman[77341]: 2026-02-23 08:15:49.08511419 +0000 UTC m=+0.259517603 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:15:49 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:15:49 localhost podman[77347]: 2026-02-23 08:15:49.16181851 +0000 UTC m=+0.330223557 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:15:49 localhost podman[77342]: 2026-02-23 08:15:49.188112909 +0000 UTC m=+0.356321460 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:15:49 localhost systemd[1]: 
9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:15:49 localhost podman[77343]: 2026-02-23 08:15:49.240294182 +0000 UTC m=+0.407368300 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=logrotate_crond) Feb 23 03:15:49 localhost podman[77347]: 2026-02-23 08:15:49.248330837 +0000 UTC m=+0.416735884 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13) Feb 
23 03:15:49 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:15:49 localhost podman[77347]: unhealthy Feb 23 03:15:49 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:15:49 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:15:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:15:52 localhost podman[77453]: 2026-02-23 08:15:52.912232035 +0000 UTC m=+0.085484137 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 23 03:15:53 localhost podman[77453]: 2026-02-23 08:15:53.307429109 +0000 UTC m=+0.480681201 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 
17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1766032510, 
batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:15:53 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:15:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:15:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:15:54 localhost systemd[1]: tmp-crun.OB8mQY.mount: Deactivated successfully. Feb 23 03:15:54 localhost podman[77476]: 2026-02-23 08:15:54.928835883 +0000 UTC m=+0.101347231 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:15:54 localhost podman[77476]: 2026-02-23 08:15:54.959224401 +0000 UTC m=+0.131735769 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:15:54 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:15:54 localhost podman[77477]: 2026-02-23 08:15:54.985663813 +0000 UTC m=+0.153674199 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510) Feb 23 03:15:55 localhost podman[77477]: 2026-02-23 08:15:55.034315055 +0000 UTC m=+0.202325451 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:56:19Z) Feb 23 03:15:55 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:15:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:15:55 localhost systemd[1]: tmp-crun.Bob56m.mount: Deactivated successfully. Feb 23 03:15:55 localhost podman[77524]: 2026-02-23 08:15:55.916295369 +0000 UTC m=+0.088768815 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, release=1766032510, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 23 03:15:56 localhost podman[77524]: 2026-02-23 08:15:56.07655892 +0000 UTC m=+0.249032356 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:15:56 localhost systemd[1]: 
f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:16:06 localhost sshd[77553]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:16:14 localhost sshd[77555]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:16:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:16:17 localhost podman[77560]: 2026-02-23 08:16:17.49528998 +0000 UTC m=+0.095333677 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:16:17 localhost podman[77560]: 2026-02-23 08:16:17.505610901 +0000 UTC m=+0.105654618 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Feb 23 03:16:17 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:16:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:16:19 localhost podman[77648]: 2026-02-23 08:16:19.928232668 +0000 UTC m=+0.096238397 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git) Feb 23 03:16:19 localhost podman[77648]: 2026-02-23 08:16:19.960314473 +0000 UTC m=+0.128320232 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red 
Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute) Feb 23 03:16:19 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:16:19 localhost podman[77647]: 2026-02-23 08:16:19.97764677 +0000 UTC m=+0.147556537 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team) Feb 23 03:16:20 localhost podman[77647]: 2026-02-23 08:16:20.015387537 +0000 UTC m=+0.185297314 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3) 
Feb 23 03:16:20 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:16:20 localhost podman[77650]: 2026-02-23 08:16:20.046790972 +0000 UTC m=+0.207038296 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 23 03:16:20 localhost podman[77649]: 2026-02-23 08:16:20.083212139 +0000 UTC m=+0.247116974 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510) Feb 23 03:16:20 localhost podman[77656]: 2026-02-23 08:16:20.14338753 +0000 UTC m=+0.301789298 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, 
io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5) Feb 23 03:16:20 localhost podman[77649]: 2026-02-23 08:16:20.151141015 +0000 UTC m=+0.315045850 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:16:20 localhost podman[77650]: 2026-02-23 08:16:20.157846579 +0000 UTC m=+0.318093933 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, 
managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=logrotate_crond) Feb 23 03:16:20 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:16:20 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:16:20 localhost podman[77656]: 2026-02-23 08:16:20.210302314 +0000 UTC m=+0.368704142 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 23 03:16:20 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:16:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:16:23 localhost systemd[1]: tmp-crun.iQNJWv.mount: Deactivated successfully. Feb 23 03:16:23 localhost podman[77782]: 2026-02-23 08:16:23.919348578 +0000 UTC m=+0.096794544 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:16:24 localhost podman[77782]: 2026-02-23 08:16:24.303973612 +0000 UTC m=+0.481419578 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z) Feb 23 03:16:24 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:16:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:16:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:16:25 localhost systemd[1]: tmp-crun.lN1p8B.mount: Deactivated successfully. Feb 23 03:16:25 localhost podman[77807]: 2026-02-23 08:16:25.923531015 +0000 UTC m=+0.093738341 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.) 
Feb 23 03:16:25 localhost podman[77806]: 2026-02-23 08:16:25.970159873 +0000 UTC m=+0.142164184 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com) Feb 23 03:16:25 localhost podman[77806]: 2026-02-23 08:16:25.996595506 +0000 UTC m=+0.168599817 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public) Feb 23 03:16:26 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:16:26 localhost podman[77807]: 2026-02-23 08:16:26.057136087 +0000 UTC m=+0.227343393 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:16:26 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:16:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:16:26 localhost podman[77853]: 2026-02-23 08:16:26.916699673 +0000 UTC m=+0.089783041 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, container_name=metrics_qdr, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5) Feb 23 03:16:27 localhost podman[77853]: 2026-02-23 08:16:27.151507771 +0000 UTC m=+0.324591169 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Feb 23 03:16:27 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:16:27 localhost systemd[1]: libpod-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a.scope: Deactivated successfully. 
Feb 23 03:16:27 localhost podman[77880]: 2026-02-23 08:16:27.810434867 +0000 UTC m=+0.066883185 container died e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_wait_for_compute_service, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}) Feb 23 03:16:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a-userdata-shm.mount: Deactivated successfully. Feb 23 03:16:27 localhost systemd[1]: var-lib-containers-storage-overlay-32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16-merged.mount: Deactivated successfully. 
Feb 23 03:16:27 localhost podman[77880]: 2026-02-23 08:16:27.852483865 +0000 UTC m=+0.108932153 container cleanup e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat 
OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true) Feb 23 03:16:27 localhost systemd[1]: libpod-conmon-e77fa343cc2d3fca32a3fb2fdb9f0766fbd1238a0f01ed862c66f4e1e684443a.scope: Deactivated successfully. Feb 23 03:16:27 localhost python3[75749]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=b5f04eda8e5f004a5ff6ec948b25cc1e --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Feb 23 03:16:28 localhost python3[77931]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:16:28 localhost python3[77947]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False 
get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Feb 23 03:16:29 localhost python3[78008]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771834588.832133-118967-132864162348382/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:16:29 localhost python3[78024]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 03:16:29 localhost systemd[1]: Reloading. Feb 23 03:16:29 localhost systemd-sysv-generator[78054]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:16:29 localhost systemd-rc-local-generator[78049]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:16:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:16:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:16:30 localhost recover_tripleo_nova_virtqemud[78062]: 61982 Feb 23 03:16:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:16:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:16:30 localhost python3[78078]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:16:30 localhost systemd[1]: Reloading. Feb 23 03:16:31 localhost systemd-sysv-generator[78110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:16:31 localhost systemd-rc-local-generator[78105]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:16:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:16:31 localhost systemd[1]: Starting nova_compute container... Feb 23 03:16:31 localhost tripleo-start-podman-container[78118]: Creating additional drop-in dependency for "nova_compute" (c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442) Feb 23 03:16:31 localhost systemd[1]: Reloading. Feb 23 03:16:31 localhost systemd-rc-local-generator[78173]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:16:31 localhost systemd-sysv-generator[78177]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:16:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:16:31 localhost systemd[1]: Started nova_compute container. 
Feb 23 03:16:32 localhost python3[78215]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:16:33 localhost python3[78336]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005626463 step=5 update_config_hash_only=False Feb 23 03:16:34 localhost python3[78352]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 03:16:34 localhost python3[78368]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Feb 23 03:16:45 localhost sshd[78445]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:16:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:16:47 localhost systemd[1]: tmp-crun.0c1HGW.mount: Deactivated successfully. 
Feb 23 03:16:47 localhost podman[78447]: 2026-02-23 08:16:47.928572198 +0000 UTC m=+0.097199466 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-collectd-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=) Feb 23 03:16:47 localhost podman[78447]: 2026-02-23 08:16:47.967625226 +0000 UTC m=+0.136252514 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) Feb 23 03:16:47 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:16:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:16:50 localhost systemd[1]: tmp-crun.jz71Tp.mount: Deactivated successfully. Feb 23 03:16:50 localhost podman[78466]: 2026-02-23 08:16:50.93108544 +0000 UTC m=+0.098135374 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 23 03:16:50 localhost podman[78466]: 2026-02-23 08:16:50.959193985 +0000 UTC m=+0.126243969 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, 
vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute) Feb 23 03:16:50 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:16:50 localhost podman[78467]: 2026-02-23 08:16:50.978926585 +0000 UTC m=+0.142619238 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible) Feb 23 03:16:51 localhost podman[78465]: 2026-02-23 08:16:51.02778556 +0000 UTC m=+0.197017501 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:16:51 localhost podman[78467]: 2026-02-23 08:16:51.032252776 +0000 UTC 
m=+0.195945369 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, 
tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:16:51 localhost podman[78465]: 2026-02-23 08:16:51.041150596 +0000 UTC m=+0.210382487 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z) Feb 23 03:16:51 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:16:51 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:16:51 localhost podman[78468]: 2026-02-23 08:16:51.088953 +0000 UTC m=+0.250689453 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-cron) Feb 23 03:16:51 localhost podman[78470]: 2026-02-23 08:16:50.949920893 +0000 UTC m=+0.104754656 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_compute) Feb 23 03:16:51 localhost podman[78468]: 2026-02-23 
08:16:51.125190912 +0000 UTC m=+0.286927295 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:16:51 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:16:51 localhost podman[78470]: 2026-02-23 08:16:51.180530634 +0000 UTC m=+0.335364397 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_compute, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:16:51 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:16:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:16:54 localhost podman[78581]: 2026-02-23 08:16:54.903945395 +0000 UTC m=+0.079298221 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:16:55 localhost podman[78581]: 2026-02-23 08:16:55.291388745 +0000 UTC m=+0.466741561 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:16:55 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. 
Feb 23 03:16:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:16:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:16:56 localhost systemd[1]: tmp-crun.1wTx0l.mount: Deactivated successfully. Feb 23 03:16:56 localhost podman[78604]: 2026-02-23 08:16:56.927951906 +0000 UTC m=+0.094802554 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z) Feb 23 03:16:56 localhost podman[78605]: 2026-02-23 08:16:56.977843042 +0000 UTC m=+0.144184424 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 23 03:16:57 localhost podman[78604]: 2026-02-23 08:16:57.031387101 +0000 UTC m=+0.198237749 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, managed_by=tripleo_ansible, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:16:57 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:16:57 localhost podman[78605]: 2026-02-23 08:16:57.082295528 +0000 UTC m=+0.248636920 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1) Feb 23 03:16:57 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:16:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:16:57 localhost podman[78652]: 2026-02-23 08:16:57.89373393 +0000 UTC m=+0.071585876 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1) Feb 23 03:16:58 localhost podman[78652]: 2026-02-23 08:16:58.084381266 +0000 UTC m=+0.262233152 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 23 03:16:58 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:16:59 localhost sshd[78681]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:16:59 localhost systemd-logind[759]: New session 33 of user zuul. Feb 23 03:16:59 localhost systemd[1]: Started Session 33 of User zuul. 
Feb 23 03:17:00 localhost python3[78790]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 03:17:04 localhost sshd[78977]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:17:07 localhost python3[79055]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Feb 23 03:17:15 localhost python3[79148]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Feb 23 03:17:15 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Feb 23 03:17:15 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 81.1 
(270 of 333 items), suggesting rotation. Feb 23 03:17:15 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 03:17:15 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 03:17:15 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 03:17:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:17:18 localhost systemd[1]: tmp-crun.RoIJ2P.mount: Deactivated successfully. Feb 23 03:17:18 localhost podman[79216]: 2026-02-23 08:17:18.923422639 +0000 UTC m=+0.089117700 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd) Feb 23 03:17:18 localhost podman[79216]: 2026-02-23 08:17:18.939301932 +0000 UTC m=+0.104996953 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., 
container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true) Feb 23 03:17:18 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:17:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:17:21 localhost podman[79236]: 2026-02-23 08:17:21.932027776 +0000 UTC m=+0.104492568 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:17:21 localhost podman[79236]: 2026-02-23 08:17:21.970260509 +0000 UTC m=+0.142725341 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, version=17.1.13, container_name=iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container) Feb 23 03:17:21 localhost podman[79239]: 2026-02-23 08:17:21.984494151 +0000 UTC m=+0.147267089 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4) Feb 23 03:17:21 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated 
successfully. Feb 23 03:17:22 localhost podman[79239]: 2026-02-23 08:17:22.021352762 +0000 UTC m=+0.184125680 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Feb 23 03:17:22 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:17:22 localhost podman[79237]: 2026-02-23 08:17:22.039330218 +0000 UTC m=+0.206893861 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Feb 23 03:17:22 localhost podman[79237]: 2026-02-23 08:17:22.076312373 +0000 UTC m=+0.243875996 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:17:22 localhost podman[79238]: 2026-02-23 08:17:22.085891254 +0000 UTC m=+0.250280270 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com) Feb 23 03:17:22 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:17:22 localhost podman[79238]: 2026-02-23 08:17:22.140941108 +0000 UTC m=+0.305330094 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:17:22 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:17:22 localhost podman[79245]: 2026-02-23 08:17:22.142717932 +0000 UTC m=+0.298694062 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:17:22 localhost podman[79245]: 2026-02-23 08:17:22.223646612 +0000 UTC m=+0.379622752 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1) Feb 23 03:17:22 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:17:24 localhost sshd[79352]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:17:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:17:25 localhost podman[79354]: 2026-02-23 08:17:25.943757131 +0000 UTC m=+0.087803402 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:17:26 localhost podman[79354]: 2026-02-23 08:17:26.351383756 +0000 UTC m=+0.495430097 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1) Feb 23 03:17:26 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:17:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:17:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:17:27 localhost podman[79377]: 2026-02-23 08:17:27.911733514 +0000 UTC m=+0.085745060 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1) Feb 23 03:17:27 localhost podman[79378]: 2026-02-23 08:17:27.967683125 +0000 UTC m=+0.138008708 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) 
Feb 23 03:17:27 localhost podman[79377]: 2026-02-23 08:17:27.98864758 +0000 UTC m=+0.162659086 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:17:28 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:17:28 localhost podman[79378]: 2026-02-23 08:17:28.043564941 +0000 UTC m=+0.213890524 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git) Feb 23 03:17:28 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:17:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:17:28 localhost systemd[1]: tmp-crun.K8mHri.mount: Deactivated successfully. 
Feb 23 03:17:28 localhost podman[79426]: 2026-02-23 08:17:28.908978603 +0000 UTC m=+0.081931273 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64) Feb 23 03:17:29 localhost podman[79426]: 2026-02-23 08:17:29.116553821 +0000 UTC m=+0.289506401 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Feb 23 03:17:29 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:17:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:17:47 localhost recover_tripleo_nova_virtqemud[79534]: 61982 Feb 23 03:17:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:17:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:17:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:17:49 localhost podman[79535]: 2026-02-23 08:17:49.925261486 +0000 UTC m=+0.094108247 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, version=17.1.13) Feb 23 03:17:49 localhost podman[79535]: 2026-02-23 08:17:49.96762928 +0000 UTC m=+0.136476111 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:17:49 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:17:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:17:52 localhost podman[79555]: 2026-02-23 08:17:52.925827276 +0000 UTC m=+0.096812900 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc.) 
Feb 23 03:17:52 localhost podman[79555]: 2026-02-23 08:17:52.963737172 +0000 UTC m=+0.134723016 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, container_name=iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 23 03:17:52 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:17:52 localhost podman[79558]: 2026-02-23 08:17:52.986237345 +0000 UTC m=+0.147877731 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:17:53 localhost podman[79558]: 2026-02-23 08:17:53.021116839 +0000 UTC m=+0.182757255 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:17:53 localhost 
systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:17:53 localhost podman[79557]: 2026-02-23 08:17:53.035531172 +0000 UTC m=+0.198886971 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc.) Feb 23 03:17:53 localhost podman[79557]: 2026-02-23 08:17:53.072333675 +0000 UTC m=+0.235689474 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:17:53 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:17:53 localhost podman[79564]: 2026-02-23 08:17:53.09069082 +0000 UTC m=+0.248716465 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, batch=17.1_20260112.1, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:17:53 localhost podman[79564]: 2026-02-23 08:17:53.122172329 +0000 UTC m=+0.280197964 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:17:53 localhost podman[79556]: 2026-02-23 08:17:53.133449896 +0000 UTC m=+0.298457407 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:17:53 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:17:53 localhost podman[79556]: 2026-02-23 08:17:53.195571797 +0000 UTC m=+0.360579358 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:17:53 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:17:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:17:56 localhost podman[79669]: 2026-02-23 08:17:56.900017098 +0000 UTC m=+0.077342782 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, vcs-type=git) Feb 23 03:17:57 localhost podman[79669]: 2026-02-23 08:17:57.286351166 +0000 UTC m=+0.463676860 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:17:57 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:17:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:17:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:17:58 localhost podman[79691]: 2026-02-23 08:17:58.916427551 +0000 UTC m=+0.084649156 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13) Feb 23 03:17:58 localhost podman[79691]: 2026-02-23 08:17:58.969464523 +0000 UTC m=+0.137686128 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, container_name=ovn_controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:17:58 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:17:59 localhost sshd[79734]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:17:59 localhost podman[79692]: 2026-02-23 08:17:58.973810387 +0000 UTC m=+0.137779720 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public) Feb 23 03:17:59 localhost podman[79692]: 2026-02-23 08:17:59.057224374 +0000 UTC m=+0.221193757 container exec_died 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 23 03:17:59 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:17:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:17:59 localhost systemd[1]: tmp-crun.oCvP5e.mount: Deactivated successfully. 
Feb 23 03:17:59 localhost podman[79741]: 2026-02-23 08:17:59.445307207 +0000 UTC m=+0.097365998 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 23 03:17:59 localhost podman[79741]: 2026-02-23 08:17:59.703730509 +0000 UTC m=+0.355789350 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, release=1766032510, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true) Feb 23 03:17:59 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:18:03 localhost sshd[79770]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:18:15 localhost systemd[1]: session-33.scope: Deactivated successfully. Feb 23 03:18:15 localhost systemd[1]: session-33.scope: Consumed 6.062s CPU time. Feb 23 03:18:15 localhost systemd-logind[759]: Session 33 logged out. Waiting for processes to exit. Feb 23 03:18:15 localhost systemd-logind[759]: Removed session 33. 
Feb 23 03:18:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:18:20 localhost podman[79817]: 2026-02-23 08:18:20.917999246 +0000 UTC m=+0.091415745 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
io.buildah.version=1.41.5, config_id=tripleo_step3, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, managed_by=tripleo_ansible) Feb 23 03:18:20 localhost podman[79817]: 2026-02-23 08:18:20.927774307 +0000 UTC m=+0.101190796 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:10:15Z) Feb 23 03:18:20 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:18:23 localhost sshd[79837]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:18:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:18:23 localhost systemd-logind[759]: New session 34 of user zuul. Feb 23 03:18:23 localhost systemd[1]: Started Session 34 of User zuul. Feb 23 03:18:23 localhost systemd[1]: tmp-crun.N3d5pf.mount: Deactivated successfully. 
Feb 23 03:18:23 localhost podman[79840]: 2026-02-23 08:18:23.405054423 +0000 UTC m=+0.098515933 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:18:23 localhost podman[79839]: 2026-02-23 08:18:23.468349911 +0000 UTC m=+0.163668648 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:18:23 localhost podman[79840]: 2026-02-23 08:18:23.489356727 +0000 UTC m=+0.182818237 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:18:23 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:18:23 localhost podman[79843]: 2026-02-23 08:18:23.569837224 +0000 UTC m=+0.254364259 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:18:23 localhost podman[79839]: 2026-02-23 08:18:23.579584853 +0000 UTC m=+0.274903610 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public) Feb 23 03:18:23 localhost podman[79841]: 2026-02-23 08:18:23.429232457 +0000 UTC m=+0.107826220 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:18:23 localhost podman[79841]: 2026-02-23 08:18:23.613330092 +0000 UTC m=+0.291923905 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible) Feb 23 03:18:23 localhost podman[79842]: 2026-02-23 08:18:23.620356659 +0000 UTC m=+0.307442293 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, 
architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:18:23 localhost podman[79842]: 2026-02-23 08:18:23.624585058 +0000 UTC m=+0.311670672 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, version=17.1.13, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:18:23 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:18:23 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:18:23 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:18:23 localhost podman[79843]: 2026-02-23 08:18:23.675290348 +0000 UTC m=+0.359817383 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 23 03:18:23 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:18:23 localhost python3[79949]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 03:18:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:18:27 localhost systemd[1]: tmp-crun.BsxNpl.mount: Deactivated successfully. Feb 23 03:18:27 localhost podman[79972]: 2026-02-23 08:18:27.921689185 +0000 UTC m=+0.093575827 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, version=17.1.13, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:18:28 localhost podman[79972]: 2026-02-23 08:18:28.305021034 +0000 UTC m=+0.476907686 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc.) Feb 23 03:18:28 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:18:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:18:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:18:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:18:29 localhost podman[79998]: 2026-02-23 08:18:29.94882503 +0000 UTC m=+0.120070433 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, architecture=x86_64, distribution-scope=public, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:18:30 localhost systemd[1]: tmp-crun.qwNFhJ.mount: Deactivated successfully. 
Feb 23 03:18:30 localhost podman[79997]: 2026-02-23 08:18:30.052127992 +0000 UTC m=+0.223595743 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, url=https://www.redhat.com, architecture=x86_64) Feb 23 03:18:30 localhost podman[79996]: 2026-02-23 08:18:30.015471447 +0000 UTC m=+0.189920857 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, 
distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:18:30 localhost podman[79996]: 2026-02-23 08:18:30.094658022 +0000 UTC m=+0.269107402 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
release=1766032510, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Feb 23 03:18:30 localhost podman[79997]: 2026-02-23 08:18:30.103426715 +0000 UTC m=+0.274894496 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z) Feb 23 03:18:30 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:18:30 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:18:30 localhost podman[79998]: 2026-02-23 08:18:30.172283046 +0000 UTC m=+0.343528409 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 23 03:18:30 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:18:41 localhost sshd[80133]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:18:50 localhost sshd[80150]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:18:50 localhost python3[80166]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Feb 23 03:18:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:18:51 localhost systemd[1]: tmp-crun.ya2VV0.mount: Deactivated successfully. 
Feb 23 03:18:51 localhost podman[80169]: 2026-02-23 08:18:51.739376371 +0000 UTC m=+0.092900465 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, container_name=collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, tcib_managed=true) Feb 23 03:18:51 localhost podman[80169]: 2026-02-23 08:18:51.750445941 +0000 UTC m=+0.103970085 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, container_name=collectd, 
name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:18:51 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:18:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:18:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:18:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:18:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:18:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:18:53 localhost podman[80192]: 2026-02-23 08:18:53.93694345 +0000 UTC m=+0.095532283 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi) Feb 23 03:18:53 localhost podman[80190]: 2026-02-23 08:18:53.98781242 +0000 UTC m=+0.152759116 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, 
version=17.1.13, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, release=1766032510, 
build-date=2026-01-12T22:34:43Z) Feb 23 03:18:54 localhost podman[80190]: 2026-02-23 08:18:54.000320677 +0000 UTC m=+0.165267343 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:18:54 localhost podman[80192]: 2026-02-23 08:18:54.019084735 +0000 UTC m=+0.177673528 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_id=tripleo_step4) Feb 23 03:18:54 localhost systemd[1]: tmp-crun.Jzu0Ds.mount: Deactivated successfully. Feb 23 03:18:54 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:18:54 localhost podman[80191]: 2026-02-23 08:18:54.041018498 +0000 UTC m=+0.202828949 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Feb 23 03:18:54 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:18:54 localhost podman[80199]: 2026-02-23 08:18:54.097854647 +0000 UTC m=+0.245507764 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com) Feb 23 03:18:54 localhost podman[80191]: 2026-02-23 08:18:54.102360168 +0000 UTC m=+0.264170639 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true) Feb 23 03:18:54 localhost podman[80199]: 2026-02-23 08:18:54.129198654 +0000 UTC m=+0.276851751 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:18:54 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:18:54 localhost podman[80193]: 2026-02-23 08:18:54.146751511 +0000 UTC m=+0.302009032 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:18:54 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:18:54 localhost podman[80193]: 2026-02-23 08:18:54.208419792 +0000 UTC m=+0.363677293 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z) Feb 23 03:18:54 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:18:54 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 03:18:54 localhost systemd[1]: Starting man-db-cache-update.service... Feb 23 03:18:54 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 03:18:55 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 23 03:18:55 localhost systemd[1]: Finished man-db-cache-update.service. Feb 23 03:18:55 localhost systemd[1]: run-r18d9633a04e94dbfa7cfbb53935b3b73.service: Deactivated successfully. Feb 23 03:18:55 localhost systemd[1]: run-rb9280c561b0b4d5d88eacf720b321e1b.service: Deactivated successfully. Feb 23 03:18:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:18:58 localhost podman[80452]: 2026-02-23 08:18:58.911618371 +0000 UTC m=+0.083759628 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 23 03:18:59 localhost podman[80452]: 2026-02-23 08:18:59.301261332 +0000 UTC m=+0.473402539 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64) Feb 23 03:18:59 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:19:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:19:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:19:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:19:00 localhost systemd[1]: tmp-crun.sEBIiq.mount: Deactivated successfully. Feb 23 03:19:00 localhost podman[80476]: 2026-02-23 08:19:00.924664656 +0000 UTC m=+0.093606099 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 23 03:19:00 localhost podman[80475]: 2026-02-23 08:19:00.975476903 +0000 UTC m=+0.145164181 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:19:01 localhost podman[80475]: 2026-02-23 08:19:01.003267842 +0000 UTC m=+0.172955170 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller) Feb 23 03:19:01 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:19:01 localhost podman[80477]: 2026-02-23 08:19:01.032473859 +0000 UTC m=+0.195744072 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1) Feb 23 03:19:01 localhost podman[80476]: 2026-02-23 08:19:01.060822386 +0000 UTC m=+0.229763869 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:19:01 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:19:01 localhost podman[80477]: 2026-02-23 08:19:01.223233972 +0000 UTC m=+0.386504155 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 23 03:19:01 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:19:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:19:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 4530 writes, 20K keys, 4530 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4530 writes, 464 syncs, 9.76 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:19:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:19:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 5013 writes, 22K keys, 5013 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5013 writes, 561 syncs, 8.94 writes 
per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:19:20 localhost sshd[80573]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:19:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:19:21 localhost podman[80598]: 2026-02-23 08:19:21.914476203 +0000 UTC m=+0.080955256 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., release=1766032510, vcs-type=git) Feb 23 03:19:21 localhost podman[80598]: 2026-02-23 08:19:21.929175124 +0000 UTC m=+0.095654187 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Feb 23 03:19:21 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:19:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:19:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:19:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:19:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:19:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:19:24 localhost systemd[1]: tmp-crun.aWIijc.mount: Deactivated successfully. 
Feb 23 03:19:24 localhost podman[80624]: 2026-02-23 08:19:24.940147701 +0000 UTC m=+0.098308136 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true) Feb 23 03:19:24 localhost podman[80624]: 2026-02-23 08:19:24.974283982 +0000 UTC m=+0.132444447 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:19:24 localhost podman[80619]: 2026-02-23 08:19:24.988276589 +0000 UTC m=+0.155998154 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible) Feb 23 03:19:24 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:19:25 localhost podman[80619]: 2026-02-23 08:19:25.030408367 +0000 UTC m=+0.198129942 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:19:25 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:19:25 localhost podman[80621]: 2026-02-23 08:19:25.03680071 +0000 UTC m=+0.201058459 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 23 03:19:25 localhost podman[80620]: 2026-02-23 08:19:25.097463977 +0000 UTC m=+0.264261110 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:19:25 localhost podman[80622]: 2026-02-23 08:19:25.15471182 +0000 UTC m=+0.312808853 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true) Feb 23 03:19:25 localhost podman[80621]: 2026-02-23 08:19:25.171270503 +0000 UTC m=+0.335528272 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 23 03:19:25 localhost podman[80620]: 2026-02-23 08:19:25.179929813 +0000 UTC m=+0.346726976 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:19:25 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:19:25 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:19:25 localhost podman[80622]: 2026-02-23 08:19:25.193282969 +0000 UTC m=+0.351380022 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:19:25 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:19:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:19:29 localhost podman[80738]: 2026-02-23 08:19:29.915340951 +0000 UTC m=+0.090807416 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=nova_migration_target) Feb 23 03:19:30 localhost python3[80776]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 03:19:30 localhost podman[80738]: 2026-02-23 08:19:30.356397408 +0000 UTC m=+0.531863873 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git) Feb 23 03:19:30 localhost systemd[1]: 
0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:19:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:19:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:19:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:19:31 localhost podman[80780]: 2026-02-23 08:19:31.926009506 +0000 UTC m=+0.093908380 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, container_name=ovn_controller, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:19:31 localhost systemd[1]: tmp-crun.7wcZPK.mount: Deactivated successfully. Feb 23 03:19:31 localhost podman[80781]: 2026-02-23 08:19:31.970942863 +0000 UTC m=+0.137474121 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, 
io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 23 03:19:32 localhost podman[80780]: 2026-02-23 08:19:32.024668313 +0000 UTC m=+0.192567237 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, distribution-scope=public, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64) Feb 23 03:19:32 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:19:32 localhost podman[80781]: 2026-02-23 08:19:32.052664963 +0000 UTC m=+0.219196221 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 23 03:19:32 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:19:32 localhost podman[80782]: 2026-02-23 08:19:32.034372681 +0000 UTC m=+0.196926625 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, release=1766032510, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Feb 23 03:19:32 localhost podman[80782]: 2026-02-23 08:19:32.257305795 +0000 UTC m=+0.419859659 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:19:32 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:19:33 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 23 03:19:33 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 23 03:19:37 localhost sshd[81040]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:19:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 23 03:19:47 localhost recover_tripleo_nova_virtqemud[81171]: 61982 Feb 23 03:19:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:19:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:19:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:19:52 localhost systemd[1]: tmp-crun.KgZfbP.mount: Deactivated successfully. Feb 23 03:19:52 localhost podman[81172]: 2026-02-23 08:19:52.932519788 +0000 UTC m=+0.103587501 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) 
Feb 23 03:19:52 localhost podman[81172]: 2026-02-23 08:19:52.943848854 +0000 UTC m=+0.114916537 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, version=17.1.13, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Feb 23 03:19:52 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:19:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:19:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:19:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:19:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:19:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:19:55 localhost podman[81194]: 2026-02-23 08:19:55.893161205 +0000 UTC m=+0.071438838 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:19:55 localhost podman[81196]: 2026-02-23 08:19:55.953048216 +0000 UTC m=+0.125305232 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13) Feb 23 03:19:55 localhost podman[81195]: 2026-02-23 08:19:55.935014495 +0000 UTC m=+0.104805089 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, release=1766032510, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, 
version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 23 03:19:56 localhost podman[81197]: 2026-02-23 08:19:56.026987236 +0000 UTC m=+0.195041699 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, container_name=logrotate_crond, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:19:56 localhost podman[81194]: 2026-02-23 08:19:56.031553583 +0000 UTC m=+0.209831266 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:19:56 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:19:56 localhost podman[81196]: 2026-02-23 08:19:56.054231215 +0000 UTC m=+0.226488261 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4) Feb 23 03:19:56 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:19:56 localhost podman[81203]: 2026-02-23 08:19:56.00593439 +0000 UTC m=+0.173316025 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack 
TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:19:56 localhost podman[81197]: 2026-02-23 08:19:56.110176525 +0000 UTC m=+0.278231038 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, version=17.1.13, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:19:56 localhost podman[81195]: 2026-02-23 08:19:56.115375708 +0000 UTC m=+0.285166252 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:07:47Z) Feb 23 03:19:56 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:19:56 localhost podman[81203]: 2026-02-23 08:19:56.135626351 +0000 UTC m=+0.303008006 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, architecture=x86_64, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:19:56 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:19:56 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:20:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:20:00 localhost systemd[1]: tmp-crun.r9jBjk.mount: Deactivated successfully. 
Feb 23 03:20:00 localhost podman[81309]: 2026-02-23 08:20:00.915586571 +0000 UTC m=+0.089782272 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:20:01 localhost sshd[81333]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:20:01 localhost podman[81309]: 2026-02-23 08:20:01.321030166 +0000 UTC m=+0.495225837 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target) Feb 23 03:20:01 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:20:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:20:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:20:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:20:02 localhost systemd[1]: tmp-crun.sOrWRS.mount: Deactivated successfully. Feb 23 03:20:02 localhost podman[81335]: 2026-02-23 08:20:02.928673938 +0000 UTC m=+0.099824464 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4) Feb 23 03:20:02 localhost podman[81336]: 2026-02-23 08:20:02.981402865 +0000 UTC m=+0.149563139 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13) Feb 23 03:20:03 localhost podman[81337]: 2026-02-23 08:20:03.027896547 +0000 UTC m=+0.193054497 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:20:03 localhost podman[81335]: 2026-02-23 08:20:03.034627273 +0000 UTC m=+0.205777799 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, 
batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 23 03:20:03 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:20:03 localhost podman[81336]: 2026-02-23 08:20:03.05450781 +0000 UTC m=+0.222668154 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z) Feb 23 03:20:03 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:20:03 localhost podman[81337]: 2026-02-23 08:20:03.233324431 +0000 UTC m=+0.398482331 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_id=tripleo_step1, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:20:03 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:20:03 localhost systemd[1]: tmp-crun.Vcr12f.mount: Deactivated successfully. Feb 23 03:20:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:20:23 localhost podman[81453]: 2026-02-23 08:20:23.909174826 +0000 UTC m=+0.088140444 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1) Feb 23 03:20:23 localhost podman[81453]: 2026-02-23 08:20:23.924343403 +0000 UTC m=+0.103308991 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, container_name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
build-date=2026-01-12T22:10:15Z, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5) Feb 23 03:20:23 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:20:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:20:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:20:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:20:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:20:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:20:26 localhost podman[81493]: 2026-02-23 08:20:26.585113906 +0000 UTC m=+0.095905585 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, build-date=2026-01-12T23:07:30Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com) Feb 23 03:20:26 localhost podman[81491]: 2026-02-23 08:20:26.637541975 +0000 UTC m=+0.155368259 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, 
managed_by=tripleo_ansible) Feb 23 03:20:26 localhost podman[81491]: 2026-02-23 08:20:26.646078008 +0000 UTC m=+0.163904282 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5) Feb 23 03:20:26 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:20:26 localhost python3[81490]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 03:20:26 localhost podman[81494]: 2026-02-23 08:20:26.68794483 +0000 UTC m=+0.195269717 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, version=17.1.13, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:20:26 localhost podman[81500]: 2026-02-23 08:20:26.737682704 +0000 UTC m=+0.243610857 container health_status 
c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Feb 23 03:20:26 localhost podman[81500]: 2026-02-23 08:20:26.796816728 +0000 UTC m=+0.302744941 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:20:26 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:20:26 localhost podman[81493]: 2026-02-23 08:20:26.817736829 +0000 UTC m=+0.328528478 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13) Feb 23 03:20:26 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:20:26 localhost podman[81492]: 2026-02-23 08:20:26.799186624 +0000 UTC m=+0.314088764 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, 
container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container) Feb 23 03:20:26 localhost podman[81494]: 2026-02-23 08:20:26.870174408 +0000 UTC m=+0.377499315 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-cron-container) Feb 23 03:20:26 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:20:26 localhost podman[81492]: 2026-02-23 08:20:26.883906438 +0000 UTC m=+0.398808588 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:20:26 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:20:29 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 23 03:20:29 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 23 03:20:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:20:31 localhost systemd[1]: tmp-crun.oLuVa7.mount: Deactivated successfully. 
Feb 23 03:20:31 localhost podman[81736]: 2026-02-23 08:20:31.918932874 +0000 UTC m=+0.095018516 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:20:32 localhost podman[81736]: 2026-02-23 08:20:32.341359928 +0000 UTC m=+0.517445560 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:20:32 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:20:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:20:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:20:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:20:33 localhost podman[81816]: 2026-02-23 08:20:33.919032258 +0000 UTC m=+0.090882133 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:20:33 localhost podman[81817]: 2026-02-23 08:20:33.967753799 +0000 UTC m=+0.136687951 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, version=17.1.13, container_name=ovn_metadata_agent, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:20:33 localhost podman[81816]: 2026-02-23 08:20:33.974454093 +0000 UTC m=+0.146303978 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:20:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:20:34 localhost podman[81818]: 2026-02-23 08:20:34.033975741 +0000 UTC m=+0.198792170 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 23 03:20:34 localhost podman[81817]: 2026-02-23 08:20:34.045580133 +0000 UTC m=+0.214514285 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 
'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:20:34 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:20:34 localhost podman[81818]: 2026-02-23 08:20:34.23555144 +0000 UTC m=+0.400367859 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5) Feb 23 03:20:34 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:20:35 localhost sshd[81892]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:20:42 localhost sshd[81894]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:20:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:20:54 localhost podman[81972]: 2026-02-23 08:20:54.937378432 +0000 UTC m=+0.097345520 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, release=1766032510, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:20:54 localhost podman[81972]: 2026-02-23 08:20:54.953135799 +0000 UTC m=+0.113102867 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 23 03:20:54 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:20:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:20:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:20:56 localhost podman[81993]: 2026-02-23 08:20:56.914133721 +0000 UTC m=+0.088500602 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
release=1766032510, architecture=x86_64, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:20:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:20:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:20:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 03:20:56 localhost podman[81993]: 2026-02-23 08:20:56.951950684 +0000 UTC m=+0.126317585 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, release=1766032510) Feb 23 03:20:56 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:20:56 localhost podman[81994]: 2026-02-23 08:20:56.968481244 +0000 UTC m=+0.137338471 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:20:57 localhost podman[81994]: 2026-02-23 08:20:57.003303903 +0000 UTC m=+0.172161120 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step5) Feb 23 03:20:57 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:20:57 localhost systemd[1]: tmp-crun.DUYInV.mount: Deactivated successfully. 
Feb 23 03:20:57 localhost podman[82025]: 2026-02-23 08:20:57.092244557 +0000 UTC m=+0.147453860 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Feb 23 03:20:57 localhost podman[82025]: 2026-02-23 08:20:57.126704673 +0000 UTC m=+0.181913946 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:20:57 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:20:57 localhost podman[82023]: 2026-02-23 08:20:57.152698413 +0000 UTC m=+0.213604266 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, 
name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:20:57 localhost podman[82023]: 2026-02-23 08:20:57.188382488 +0000 UTC m=+0.249288371 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:20:57 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:20:57 localhost podman[82024]: 2026-02-23 08:20:57.191734134 +0000 UTC m=+0.249187228 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git) Feb 23 03:20:57 localhost podman[82024]: 2026-02-23 08:20:57.277380075 +0000 UTC m=+0.334833149 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible) Feb 23 03:20:57 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:21:01 localhost python3[82120]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Feb 23 03:21:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:21:02 localhost podman[82121]: 2026-02-23 08:21:02.918133129 +0000 UTC m=+0.086523309 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:21:03 localhost podman[82121]: 2026-02-23 08:21:03.310322785 +0000 UTC m=+0.478712965 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:21:03 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:21:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:21:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:21:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:21:04 localhost systemd[1]: tmp-crun.IbM3pm.mount: Deactivated successfully. Feb 23 03:21:04 localhost podman[82146]: 2026-02-23 08:21:04.920917259 +0000 UTC m=+0.093770417 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team) Feb 23 03:21:04 localhost podman[82145]: 2026-02-23 08:21:04.969960536 +0000 UTC m=+0.144979892 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:21:04 localhost podman[82145]: 2026-02-23 08:21:04.99927443 +0000 UTC m=+0.174293826 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:21:05 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:21:05 localhost podman[82146]: 2026-02-23 08:21:05.050307099 +0000 UTC m=+0.223160257 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:21:05 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:21:05 localhost podman[82147]: 2026-02-23 08:21:05.128400041 +0000 UTC m=+0.297552232 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 23 03:21:05 localhost podman[82147]: 2026-02-23 08:21:05.358424464 +0000 UTC m=+0.527576675 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:21:05 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:21:24 localhost sshd[82241]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:21:25 localhost sshd[82268]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:21:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:21:25 localhost podman[82270]: 2026-02-23 08:21:25.908082505 +0000 UTC m=+0.081652526 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:21:25 localhost podman[82270]: 2026-02-23 08:21:25.920405653 +0000 UTC m=+0.093975664 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 23 03:21:25 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:21:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:21:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:21:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:21:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:21:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:21:27 localhost systemd[1]: tmp-crun.WXLoHE.mount: Deactivated successfully. Feb 23 03:21:27 localhost podman[82289]: 2026-02-23 08:21:27.933750955 +0000 UTC m=+0.103590037 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) 
Feb 23 03:21:27 localhost podman[82289]: 2026-02-23 08:21:27.965473655 +0000 UTC m=+0.135312737 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:21:27 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:21:27 localhost podman[82288]: 2026-02-23 08:21:27.978023602 +0000 UTC m=+0.152324315 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3) Feb 23 03:21:28 localhost podman[82288]: 2026-02-23 08:21:28.016458193 +0000 UTC m=+0.190758846 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO 
Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 23 03:21:28 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:21:28 localhost podman[82297]: 2026-02-23 08:21:28.039246312 +0000 UTC m=+0.200990779 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git) Feb 23 03:21:28 localhost podman[82291]: 2026-02-23 08:21:28.085066436 +0000 UTC m=+0.249738895 container health_status 
b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:21:28 localhost podman[82291]: 2026-02-23 08:21:28.096171577 +0000 UTC m=+0.260844036 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, container_name=logrotate_crond, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public) Feb 23 03:21:28 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:21:28 localhost podman[82290]: 2026-02-23 08:21:28.193865437 +0000 UTC m=+0.362350547 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:21:28 localhost podman[82297]: 2026-02-23 08:21:28.206154074 +0000 UTC m=+0.367898591 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510) Feb 23 03:21:28 localhost systemd[1]: 
c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:21:28 localhost podman[82290]: 2026-02-23 08:21:28.233512237 +0000 UTC m=+0.401997397 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:21:28 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:21:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:21:33 localhost systemd[1]: tmp-crun.bV5F5G.mount: Deactivated successfully. 
Feb 23 03:21:33 localhost podman[82404]: 2026-02-23 08:21:33.91922999 +0000 UTC m=+0.091489636 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4) Feb 23 03:21:34 localhost podman[82404]: 2026-02-23 08:21:34.307407519 +0000 UTC m=+0.479667175 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, container_name=nova_migration_target, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:21:34 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:21:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:21:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:21:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:21:35 localhost systemd[1]: tmp-crun.4enAOv.mount: Deactivated successfully. Feb 23 03:21:35 localhost podman[82429]: 2026-02-23 08:21:35.933086167 +0000 UTC m=+0.098729354 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Feb 23 03:21:35 localhost systemd[1]: tmp-crun.LyIpLx.mount: Deactivated successfully. Feb 23 03:21:35 localhost podman[82427]: 2026-02-23 08:21:35.977323302 +0000 UTC m=+0.145579711 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, 
architecture=x86_64, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13) Feb 23 03:21:36 localhost podman[82427]: 2026-02-23 08:21:36.004256942 +0000 UTC m=+0.172513381 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public) Feb 23 03:21:36 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:21:36 localhost podman[82428]: 2026-02-23 08:21:36.024700806 +0000 UTC m=+0.191979284 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:21:36 localhost podman[82428]: 2026-02-23 08:21:36.065490193 +0000 UTC m=+0.232768701 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:21:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:21:36 localhost podman[82429]: 2026-02-23 08:21:36.153488067 +0000 UTC m=+0.319131294 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:21:36 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:21:46 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:21:46 localhost recover_tripleo_nova_virtqemud[82517]: 61982 Feb 23 03:21:46 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:21:46 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:21:47 localhost podman[82606]: 2026-02-23 08:21:47.844412647 +0000 UTC m=+0.139842875 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.openshift.expose-services=, vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, version=7, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 03:21:48 localhost podman[82606]: 2026-02-23 08:21:48.009786215 +0000 UTC m=+0.305216453 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, version=7, CEPH_POINT_RELEASE=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, release=1770267347, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main) Feb 23 03:21:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:21:56 localhost podman[82751]: 2026-02-23 08:21:56.937245674 +0000 UTC m=+0.099188332 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=collectd, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:21:56 localhost podman[82751]: 2026-02-23 08:21:56.949815765 +0000 UTC m=+0.111758403 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:21:56 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:21:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:21:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:21:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:21:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:21:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:21:58 localhost podman[82773]: 2026-02-23 08:21:58.944912958 +0000 UTC m=+0.101784223 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:21:59 localhost systemd[1]: tmp-crun.7jXv00.mount: Deactivated successfully. 
Feb 23 03:21:59 localhost podman[82776]: 2026-02-23 08:21:59.003939169 +0000 UTC m=+0.152126398 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=) Feb 23 03:21:59 localhost podman[82776]: 2026-02-23 08:21:59.041274882 +0000 UTC m=+0.189462051 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Feb 23 03:21:59 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:21:59 localhost podman[82782]: 2026-02-23 08:21:59.060852493 +0000 UTC m=+0.202126175 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, release=1766032510, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T23:32:04Z) Feb 23 03:21:59 localhost podman[82782]: 2026-02-23 08:21:59.098528006 +0000 UTC m=+0.239801738 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack 
Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, architecture=x86_64, vcs-type=git, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T23:32:04Z) Feb 23 03:21:59 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:21:59 localhost podman[82774]: 2026-02-23 08:21:59.121375986 +0000 UTC m=+0.276925786 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z) Feb 23 03:21:59 localhost podman[82775]: 2026-02-23 08:21:59.162281765 +0000 UTC m=+0.314839863 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, 
managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:21:59 localhost podman[82773]: 2026-02-23 08:21:59.18344789 +0000 UTC m=+0.340319165 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:21:59 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:21:59 localhost podman[82775]: 2026-02-23 08:21:59.219308703 +0000 UTC m=+0.371866831 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible) Feb 23 03:21:59 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:21:59 localhost podman[82774]: 2026-02-23 08:21:59.233730144 +0000 UTC m=+0.389279924 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 23 03:21:59 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:22:01 localhost systemd[1]: session-34.scope: Deactivated successfully. Feb 23 03:22:01 localhost systemd[1]: session-34.scope: Consumed 20.336s CPU time. Feb 23 03:22:01 localhost systemd-logind[759]: Session 34 logged out. Waiting for processes to exit. Feb 23 03:22:01 localhost systemd-logind[759]: Removed session 34. Feb 23 03:22:04 localhost sshd[82887]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:22:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:22:05 localhost podman[82889]: 2026-02-23 08:22:05.06798026 +0000 UTC m=+0.061838899 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:22:05 localhost podman[82889]: 2026-02-23 08:22:05.588919185 +0000 UTC m=+0.582777854 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git) Feb 23 03:22:05 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:22:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:22:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:22:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:22:06 localhost podman[82912]: 2026-02-23 08:22:06.900974426 +0000 UTC m=+0.078062569 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Feb 23 03:22:06 localhost podman[82913]: 2026-02-23 08:22:06.957252713 +0000 UTC m=+0.131480629 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team) Feb 23 03:22:07 localhost podman[82914]: 2026-02-23 08:22:07.017545669 +0000 UTC m=+0.188860146 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, 
distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red 
Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:22:07 localhost podman[82912]: 2026-02-23 08:22:07.023560563 +0000 UTC m=+0.200648746 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, container_name=ovn_controller, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z) Feb 23 03:22:07 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:22:07 localhost podman[82913]: 2026-02-23 08:22:07.038326693 +0000 UTC m=+0.212554599 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 23 03:22:07 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:22:07 localhost podman[82914]: 2026-02-23 08:22:07.246283545 +0000 UTC m=+0.417598022 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, tcib_managed=true, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
container_name=metrics_qdr, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:22:07 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:22:07 localhost systemd[1]: tmp-crun.0y7OLF.mount: Deactivated successfully. Feb 23 03:22:13 localhost sshd[82990]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:22:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:22:27 localhost podman[83037]: 2026-02-23 08:22:27.911003188 +0000 UTC m=+0.082654774 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=collectd, url=https://www.redhat.com) Feb 23 03:22:27 localhost podman[83037]: 2026-02-23 08:22:27.923156558 +0000 UTC m=+0.094808144 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack 
Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, release=1766032510) Feb 23 03:22:27 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:22:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:22:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:22:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:22:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:22:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:22:29 localhost podman[83058]: 2026-02-23 08:22:29.928139021 +0000 UTC m=+0.096998384 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:22:29 localhost systemd[1]: tmp-crun.Rg73MD.mount: Deactivated successfully. 
Feb 23 03:22:29 localhost podman[83057]: 2026-02-23 08:22:29.97638694 +0000 UTC m=+0.145777347 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, container_name=iscsid, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:22:29 localhost podman[83058]: 2026-02-23 08:22:29.986278058 +0000 UTC m=+0.155137351 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Feb 23 03:22:29 localhost podman[83071]: 2026-02-23 08:22:29.942937802 +0000 UTC m=+0.098594176 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.openshift.expose-services=, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, release=1766032510, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Feb 23 03:22:29 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:22:29 localhost podman[83057]: 2026-02-23 08:22:29.99925077 +0000 UTC m=+0.168641187 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 
17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:22:30 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:22:30 localhost podman[83071]: 2026-02-23 08:22:30.023977661 +0000 UTC m=+0.179634015 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:22:30 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:22:30 localhost podman[83060]: 2026-02-23 08:22:30.09434852 +0000 UTC m=+0.253870019 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 
17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:22:30 localhost podman[83060]: 2026-02-23 08:22:30.107184899 +0000 UTC m=+0.266706368 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) Feb 23 03:22:30 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:22:30 localhost podman[83059]: 2026-02-23 08:22:30.004387399 +0000 UTC m=+0.166278312 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510) Feb 23 03:22:30 localhost podman[83059]: 2026-02-23 08:22:30.19533144 +0000 UTC m=+0.357222383 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:22:30 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:22:30 localhost systemd[1]: tmp-crun.lRKrUR.mount: Deactivated successfully. Feb 23 03:22:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:22:35 localhost systemd[1]: tmp-crun.9kG0e1.mount: Deactivated successfully. 
Feb 23 03:22:35 localhost podman[83173]: 2026-02-23 08:22:35.929464959 +0000 UTC m=+0.102168702 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target) Feb 23 03:22:36 localhost podman[83173]: 2026-02-23 08:22:36.287006711 +0000 UTC m=+0.459710444 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 23 03:22:36 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:22:37 localhost podman[83197]: 2026-02-23 08:22:37.914181663 +0000 UTC m=+0.085476731 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:22:37 localhost podman[83197]: 2026-02-23 08:22:37.945710008 +0000 UTC m=+0.117005126 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.openshift.expose-services=) Feb 23 03:22:37 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:22:37 localhost podman[83199]: 2026-02-23 08:22:37.938249335 +0000 UTC m=+0.102452910 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:22:38 localhost podman[83198]: 2026-02-23 08:22:38.019051798 +0000 UTC m=+0.186817760 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Feb 23 03:22:38 localhost podman[83198]: 2026-02-23 08:22:38.065373345 +0000 UTC m=+0.233139277 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, container_name=ovn_metadata_agent, version=17.1.13, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git) Feb 23 03:22:38 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:22:38 localhost podman[83199]: 2026-02-23 08:22:38.113328836 +0000 UTC m=+0.277532381 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:22:38 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:22:43 localhost sshd[83273]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:22:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:22:59 localhost systemd[1]: tmp-crun.Wl1Wew.mount: Deactivated successfully. 
Feb 23 03:22:59 localhost podman[83351]: 2026-02-23 08:22:59.019790598 +0000 UTC m=+0.099596292 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64) Feb 23 03:22:59 localhost podman[83351]: 2026-02-23 08:22:59.057417431 +0000 UTC m=+0.137223085 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible) Feb 23 03:22:59 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:23:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:23:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:23:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:23:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:23:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:23:00 localhost podman[83373]: 2026-02-23 08:23:00.950697298 +0000 UTC m=+0.110622844 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 23 03:23:00 localhost podman[83374]: 2026-02-23 08:23:00.9972465 +0000 UTC m=+0.154217060 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
container_name=ceilometer_agent_ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:23:01 localhost podman[83372]: 2026-02-23 08:23:01.046995694 +0000 UTC m=+0.210001279 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=iscsid, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13) Feb 23 03:23:01 localhost podman[83372]: 2026-02-23 08:23:01.058324714 +0000 UTC m=+0.221330329 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:23:01 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:23:01 localhost podman[83381]: 2026-02-23 08:23:01.10894533 +0000 UTC m=+0.259486395 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:23:01 localhost podman[83374]: 2026-02-23 08:23:01.112547193 +0000 UTC m=+0.269517753 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
version=17.1.13) Feb 23 03:23:01 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:23:01 localhost podman[83373]: 2026-02-23 08:23:01.163645201 +0000 UTC m=+0.323570747 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com) Feb 23 03:23:01 localhost podman[83381]: 2026-02-23 08:23:01.172019835 +0000 UTC m=+0.322560910 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-type=git, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, 
distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:23:01 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:23:01 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:23:01 localhost podman[83375]: 2026-02-23 08:23:01.16793515 +0000 UTC m=+0.307357010 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5) Feb 23 03:23:01 localhost podman[83375]: 2026-02-23 08:23:01.248467082 +0000 UTC m=+0.387888902 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 23 03:23:01 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:23:03 localhost sshd[83580]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:23:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:23:06 localhost podman[83622]: 2026-02-23 08:23:06.903259652 +0000 UTC m=+0.079327503 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:23:07 localhost podman[83622]: 2026-02-23 08:23:07.32267654 +0000 UTC m=+0.498744461 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, release=1766032510) Feb 23 03:23:07 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:23:08 localhost podman[83872]: 2026-02-23 08:23:08.911099661 +0000 UTC m=+0.077190347 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1) Feb 23 03:23:08 localhost podman[83873]: 2026-02-23 08:23:08.970286647 +0000 UTC m=+0.133533290 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, distribution-scope=public) Feb 23 03:23:08 localhost podman[83872]: 2026-02-23 08:23:08.980266072 +0000 UTC m=+0.146356698 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 03:23:08 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:23:09 localhost podman[83871]: 2026-02-23 08:23:09.030007926 +0000 UTC m=+0.196403230 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, 
batch=17.1_20260112.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:23:09 localhost podman[83871]: 2026-02-23 08:23:09.078345903 +0000 UTC m=+0.244741227 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, architecture=x86_64, container_name=ovn_controller, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:23:09 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:23:09 localhost podman[83873]: 2026-02-23 08:23:09.153172279 +0000 UTC m=+0.316418902 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 23 03:23:09 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:23:09 localhost systemd-logind[759]: Existing logind session ID 28 used by new audit session, ignoring. Feb 23 03:23:09 localhost systemd[1]: Created slice User Slice of UID 0. Feb 23 03:23:09 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 23 03:23:09 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Feb 23 03:23:09 localhost systemd[1]: Starting User Manager for UID 0... Feb 23 03:23:09 localhost systemd[83969]: Queued start job for default target Main User Target. 
Feb 23 03:23:09 localhost systemd[83969]: Created slice User Application Slice. Feb 23 03:23:09 localhost systemd[83969]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 23 03:23:09 localhost systemd[83969]: Started Daily Cleanup of User's Temporary Directories. Feb 23 03:23:09 localhost systemd[83969]: Reached target Paths. Feb 23 03:23:09 localhost systemd[83969]: Reached target Timers. Feb 23 03:23:09 localhost systemd[83969]: Starting D-Bus User Message Bus Socket... Feb 23 03:23:09 localhost systemd[83969]: Starting Create User's Volatile Files and Directories... Feb 23 03:23:09 localhost systemd[83969]: Listening on D-Bus User Message Bus Socket. Feb 23 03:23:09 localhost systemd[83969]: Finished Create User's Volatile Files and Directories. Feb 23 03:23:09 localhost systemd[83969]: Reached target Sockets. Feb 23 03:23:09 localhost systemd[83969]: Reached target Basic System. Feb 23 03:23:09 localhost systemd[1]: Started User Manager for UID 0. Feb 23 03:23:09 localhost systemd[83969]: Reached target Main User Target. Feb 23 03:23:09 localhost systemd[83969]: Startup finished in 165ms. Feb 23 03:23:09 localhost systemd[1]: Started Session c11 of User root. Feb 23 03:23:09 localhost systemd[1]: tmp-crun.1AQc6g.mount: Deactivated successfully. Feb 23 03:23:10 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Feb 23 03:23:10 localhost kernel: device tapa27e5011-20 entered promiscuous mode Feb 23 03:23:10 localhost NetworkManager[5974]: [1771834990.8260] manager: (tapa27e5011-20): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Feb 23 03:23:10 localhost systemd-udevd[84004]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 03:23:10 localhost NetworkManager[5974]: [1771834990.8493] device (tapa27e5011-20): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Feb 23 03:23:10 localhost NetworkManager[5974]: [1771834990.8535] device (tapa27e5011-20): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Feb 23 03:23:10 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 23 03:23:10 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Feb 23 03:23:10 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Feb 23 03:23:10 localhost systemd-machined[84014]: New machine qemu-1-instance-00000003. Feb 23 03:23:10 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000003. Feb 23 03:23:11 localhost NetworkManager[5974]: [1771834991.1179] manager: (tap9da5b53d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Feb 23 03:23:11 localhost systemd-udevd[84003]: Network interface NamePolicy= disabled on kernel command line. Feb 23 03:23:11 localhost NetworkManager[5974]: [1771834991.1767] device (tap9da5b53d-30): carrier: link connected Feb 23 03:23:11 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9da5b53d-31: link becomes ready Feb 23 03:23:11 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9da5b53d-30: link becomes ready Feb 23 03:23:11 localhost kernel: device tap9da5b53d-30 entered promiscuous mode Feb 23 03:23:12 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 23 03:23:12 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Feb 23 03:23:13 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. 
Feb 23 03:23:13 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Feb 23 03:23:13 localhost podman[84151]: 2026-02-23 08:23:13.828582553 +0000 UTC m=+0.104585439 container create f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 03:23:13 localhost podman[84151]: 2026-02-23 08:23:13.778388248 +0000 UTC m=+0.054391174 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Feb 23 03:23:13 localhost systemd[1]: Started 
libpod-conmon-f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b.scope. Feb 23 03:23:13 localhost systemd[1]: tmp-crun.AYFn2b.mount: Deactivated successfully. Feb 23 03:23:13 localhost systemd[1]: Started libcrun container. Feb 23 03:23:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3b968a22d6dac5274c225974669ffbf9fd10e196a31be0e89003b3aedfce825/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 03:23:13 localhost podman[84151]: 2026-02-23 08:23:13.949288874 +0000 UTC m=+0.225291760 container init f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 23 03:23:13 localhost podman[84151]: 2026-02-23 08:23:13.959536067 +0000 UTC m=+0.235538953 container start f1c94b6d873a5ea7f04a27d272093dc9c80dc0bedf3bc6f8302f2ec7ba926e2b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:23:14 localhost setroubleshoot[84108]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. 
For complete SELinux messages run: sealert -l a176daef-12e3-44f0-9641-bedf749d0981 Feb 23 03:23:14 localhost setroubleshoot[84108]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Feb 23 03:23:21 localhost snmpd[67690]: empty variable list in _query Feb 23 03:23:21 localhost snmpd[67690]: empty variable list in _query Feb 23 03:23:23 localhost sshd[84177]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:23:23 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Feb 23 03:23:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:23:23 localhost recover_tripleo_nova_virtqemud[84180]: 61982 Feb 23 03:23:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:23:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:23:24 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Feb 23 03:23:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:23:29 localhost podman[84228]: 2026-02-23 08:23:29.925478903 +0000 UTC m=+0.095642700 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64) Feb 23 03:23:29 localhost podman[84228]: 2026-02-23 08:23:29.96640764 +0000 UTC m=+0.136571467 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, container_name=collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:23:29 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:23:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:23:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:23:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:23:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:23:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:23:31 localhost systemd[1]: tmp-crun.Mlj77o.mount: Deactivated successfully. Feb 23 03:23:31 localhost podman[84251]: 2026-02-23 08:23:31.929752462 +0000 UTC m=+0.096524842 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, release=1766032510, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:23:32 localhost podman[84251]: 2026-02-23 08:23:32.215289033 +0000 UTC m=+0.382061424 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 23 03:23:32 localhost podman[84253]: 2026-02-23 08:23:32.218007743 +0000 UTC m=+0.379511828 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:23:32 localhost podman[84250]: 2026-02-23 08:23:32.223784431 +0000 UTC m=+0.392148792 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z) Feb 23 03:23:32 localhost podman[84253]: 2026-02-23 08:23:32.246996846 +0000 UTC m=+0.408500911 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_compute) Feb 23 03:23:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:23:32 localhost podman[84250]: 2026-02-23 08:23:32.285240325 +0000 UTC m=+0.453604666 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 23 03:23:32 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:23:32 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:23:32 localhost podman[84252]: 2026-02-23 08:23:32.219954603 +0000 UTC m=+0.383389808 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z) Feb 23 03:23:32 localhost podman[84249]: 2026-02-23 08:23:32.220285762 +0000 UTC m=+0.390002197 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:23:32 localhost podman[84249]: 2026-02-23 08:23:32.354223451 +0000 UTC m=+0.523939866 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:23:32 localhost systemd[1]: 
40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:23:32 localhost podman[84252]: 2026-02-23 08:23:32.403313478 +0000 UTC m=+0.566748763 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64) Feb 23 03:23:32 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40768 [23/Feb/2026:08:23:29.582] listener listener/metadata 0/0/0/4559/4559 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40784 [23/Feb/2026:08:23:34.244] listener listener/metadata 0/0/0/16/16 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40786 [23/Feb/2026:08:23:34.299] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40798 [23/Feb/2026:08:23:34.349] listener listener/metadata 0/0/0/11/11 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40812 [23/Feb/2026:08:23:34.400] listener 
listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40828 [23/Feb/2026:08:23:34.448] listener listener/metadata 0/0/0/11/11 200 132 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40832 [23/Feb/2026:08:23:34.496] listener listener/metadata 0/0/0/11/11 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40836 [23/Feb/2026:08:23:34.543] listener listener/metadata 0/0/0/9/9 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40846 [23/Feb/2026:08:23:34.589] listener listener/metadata 0/0/0/9/9 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40850 [23/Feb/2026:08:23:34.636] listener listener/metadata 0/0/0/10/10 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40864 [23/Feb/2026:08:23:34.685] listener listener/metadata 0/0/0/10/10 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40872 [23/Feb/2026:08:23:34.726] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Feb 23 03:23:34 localhost 
haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40888 [23/Feb/2026:08:23:34.767] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40900 [23/Feb/2026:08:23:34.807] listener listener/metadata 0/0/0/9/9 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40906 [23/Feb/2026:08:23:34.856] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Feb 23 03:23:34 localhost haproxy-metadata-proxy-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d[84172]: 192.168.0.12:40918 [23/Feb/2026:08:23:34.907] listener listener/metadata 0/0/0/10/10 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Feb 23 03:23:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:23:37 localhost systemd[1]: tmp-crun.aVNruL.mount: Deactivated successfully. 
Feb 23 03:23:37 localhost podman[84366]: 2026-02-23 08:23:37.929238639 +0000 UTC m=+0.091068074 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-type=git) Feb 23 03:23:38 localhost podman[84366]: 2026-02-23 08:23:38.313324123 +0000 UTC m=+0.475153568 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510) Feb 23 03:23:38 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:23:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:23:39 localhost systemd[1]: tmp-crun.wpvKUk.mount: Deactivated successfully. Feb 23 03:23:39 localhost podman[84389]: 2026-02-23 08:23:39.921307255 +0000 UTC m=+0.092132781 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_controller, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:23:39 localhost podman[84390]: 2026-02-23 08:23:39.963356141 +0000 UTC m=+0.131339654 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, tcib_managed=true, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z) Feb 23 03:23:40 localhost podman[84390]: 2026-02-23 08:23:40.007097942 +0000 UTC m=+0.175081455 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, 
distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13) Feb 23 03:23:40 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:23:40 localhost podman[84391]: 2026-02-23 08:23:40.021863139 +0000 UTC m=+0.187612824 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 23 03:23:40 localhost podman[84389]: 2026-02-23 08:23:40.041859941 +0000 UTC m=+0.212685477 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64) Feb 23 03:23:40 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:23:40 localhost podman[84391]: 2026-02-23 08:23:40.184206846 +0000 UTC m=+0.349956511 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1) Feb 23 03:23:40 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:23:46 localhost sshd[84464]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:24:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:24:00 localhost podman[84543]: 2026-02-23 08:24:00.920965878 +0000 UTC m=+0.091761022 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, distribution-scope=public) Feb 23 03:24:00 localhost podman[84543]: 2026-02-23 08:24:00.931990546 +0000 UTC m=+0.102785660 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team) Feb 23 03:24:00 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:24:01 localhost sshd[84563]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:24:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:24:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:24:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:24:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:24:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:24:02 localhost systemd[1]: tmp-crun.mPJ1Lx.mount: Deactivated successfully. Feb 23 03:24:02 localhost podman[84567]: 2026-02-23 08:24:02.936091692 +0000 UTC m=+0.098522480 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1) Feb 23 03:24:02 localhost podman[84566]: 2026-02-23 08:24:02.982549785 +0000 UTC m=+0.147077237 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, 
url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:24:03 localhost podman[84565]: 2026-02-23 08:24:03.027607685 +0000 UTC m=+0.192957513 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, distribution-scope=public, version=17.1.13, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Feb 23 03:24:03 localhost podman[84566]: 2026-02-23 08:24:03.042227773 +0000 UTC m=+0.206755195 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, distribution-scope=public, tcib_managed=true, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., 
name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Feb 23 03:24:03 localhost podman[84567]: 2026-02-23 08:24:03.050968422 +0000 UTC m=+0.213399250 container exec_died 
9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible) Feb 23 03:24:03 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:24:03 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:24:03 localhost podman[84568]: 2026-02-23 08:24:03.092620067 +0000 UTC m=+0.248838824 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:24:03 localhost podman[84568]: 2026-02-23 08:24:03.104250974 +0000 UTC m=+0.260469721 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-type=git) 
Feb 23 03:24:03 localhost podman[84565]: 2026-02-23 08:24:03.113624621 +0000 UTC m=+0.278974479 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:24:03 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:24:03 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:24:03 localhost podman[84569]: 2026-02-23 08:24:03.198587654 +0000 UTC m=+0.351746928 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 23 03:24:03 localhost podman[84569]: 2026-02-23 08:24:03.229967976 +0000 UTC m=+0.383127250 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:24:03 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:24:08 localhost podman[84677]: 2026-02-23 08:24:08.918195183 +0000 UTC m=+0.091006579 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 
nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, url=https://www.redhat.com) Feb 23 03:24:09 localhost podman[84677]: 2026-02-23 08:24:09.30646847 +0000 UTC m=+0.479279866 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, release=1766032510, managed_by=tripleo_ansible) Feb 23 03:24:09 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:24:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:24:10 localhost systemd[1]: tmp-crun.zpSC71.mount: Deactivated successfully. Feb 23 03:24:10 localhost podman[84700]: 2026-02-23 08:24:10.93176851 +0000 UTC m=+0.100404038 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 23 03:24:10 localhost podman[84699]: 2026-02-23 08:24:10.980596975 +0000 UTC m=+0.152094181 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, version=17.1.13, managed_by=tripleo_ansible, 
konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc.) 
Feb 23 03:24:11 localhost podman[84698]: 2026-02-23 08:24:11.029426092 +0000 UTC m=+0.204407414 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Feb 23 03:24:11 localhost podman[84698]: 2026-02-23 08:24:11.060395371 +0000 UTC m=+0.235376733 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1) Feb 23 03:24:11 localhost podman[84699]: 2026-02-23 08:24:11.067527509 +0000 UTC m=+0.239024695 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 23 03:24:11 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:24:11 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:24:11 localhost podman[84700]: 2026-02-23 08:24:11.154368869 +0000 UTC m=+0.323004357 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=metrics_qdr, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:24:11 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:24:11 localhost systemd[1]: tmp-crun.FWIFAv.mount: Deactivated successfully. Feb 23 03:24:31 localhost sshd[84817]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:24:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:24:31 localhost systemd[1]: tmp-crun.X4sZI7.mount: Deactivated successfully. 
Feb 23 03:24:31 localhost podman[84819]: 2026-02-23 08:24:31.924495444 +0000 UTC m=+0.095320391 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd) Feb 23 03:24:31 localhost podman[84819]: 2026-02-23 08:24:31.932730257 +0000 UTC m=+0.103555224 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, container_name=collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3) Feb 23 03:24:31 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:24:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:24:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:24:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:24:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:24:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:24:33 localhost systemd[1]: tmp-crun.AWbBme.mount: Deactivated successfully. Feb 23 03:24:33 localhost podman[84841]: 2026-02-23 08:24:33.938312188 +0000 UTC m=+0.101501531 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1766032510, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Feb 23 03:24:33 localhost podman[84842]: 2026-02-23 08:24:33.97655212 +0000 UTC m=+0.136859135 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git) Feb 23 03:24:34 localhost podman[84841]: 2026-02-23 08:24:34.021337342 +0000 UTC m=+0.184526645 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:24:34 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:24:34 localhost podman[84844]: 2026-02-23 08:24:34.027321415 +0000 UTC m=+0.183809213 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:24:34 localhost podman[84839]: 2026-02-23 08:24:34.092931655 +0000 UTC m=+0.260294705 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13) Feb 23 
03:24:34 localhost podman[84839]: 2026-02-23 08:24:34.106407808 +0000 UTC m=+0.273770868 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container) Feb 23 03:24:34 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:24:34 localhost podman[84840]: 2026-02-23 08:24:33.956035932 +0000 UTC m=+0.121917337 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 23 03:24:34 localhost podman[84842]: 2026-02-23 08:24:34.162299791 +0000 UTC m=+0.322606816 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:15Z) Feb 23 03:24:34 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:24:34 localhost podman[84840]: 2026-02-23 08:24:34.192444645 +0000 UTC m=+0.358326040 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:24:34 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:24:34 localhost podman[84844]: 2026-02-23 08:24:34.214196512 +0000 UTC m=+0.370684340 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=nova_compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:24:34 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:24:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:24:39 localhost podman[84953]: 2026-02-23 08:24:39.909373661 +0000 UTC m=+0.083589693 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:24:40 localhost sshd[84974]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:24:40 localhost podman[84953]: 2026-02-23 08:24:40.288750074 +0000 UTC m=+0.462966106 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:24:40 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:24:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:24:41 localhost sshd[84977]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:24:41 localhost systemd[1]: tmp-crun.LdO701.mount: Deactivated successfully. Feb 23 03:24:41 localhost podman[84979]: 2026-02-23 08:24:41.908698129 +0000 UTC m=+0.080189257 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:24:41 localhost podman[84978]: 2026-02-23 08:24:41.929246569 +0000 UTC m=+0.101021186 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:24:42 localhost podman[84976]: 2026-02-23 08:24:42.013616074 +0000 UTC m=+0.188138775 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z) Feb 23 03:24:42 localhost podman[84976]: 2026-02-23 08:24:42.035592868 +0000 UTC m=+0.210115589 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13) Feb 23 03:24:42 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:24:42 localhost podman[84978]: 2026-02-23 08:24:42.092556733 +0000 UTC m=+0.264331410 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z) Feb 23 03:24:42 localhost systemd[1]: 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:24:42 localhost podman[84979]: 2026-02-23 08:24:42.151311713 +0000 UTC m=+0.322802821 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:24:42 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:24:42 localhost systemd[1]: tmp-crun.6aubgE.mount: Deactivated successfully. Feb 23 03:25:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:25:02 localhost systemd[1]: tmp-crun.EeqVq1.mount: Deactivated successfully. 
Feb 23 03:25:02 localhost podman[85134]: 2026-02-23 08:25:02.936376231 +0000 UTC m=+0.106693343 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com) Feb 23 03:25:02 localhost podman[85134]: 2026-02-23 08:25:02.949276885 +0000 UTC m=+0.119594027 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:25:02 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:25:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:25:04 localhost podman[85158]: 2026-02-23 08:25:04.932948768 +0000 UTC m=+0.087572617 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red 
Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:25:04 localhost podman[85157]: 2026-02-23 08:25:04.914814334 +0000 UTC m=+0.078557221 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat 
OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, com.redhat.component=openstack-cron-container, vcs-type=git, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:25:04 localhost podman[85155]: 2026-02-23 08:25:04.991629511 +0000 UTC m=+0.158844605 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Feb 23 03:25:05 localhost podman[85157]: 2026-02-23 08:25:05.00043252 +0000 UTC m=+0.164175377 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 
23 03:25:05 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:25:05 localhost podman[85156]: 2026-02-23 08:25:05.025040763 +0000 UTC m=+0.189765191 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, distribution-scope=public) Feb 23 03:25:05 localhost podman[85155]: 2026-02-23 08:25:05.073443982 +0000 UTC m=+0.240659076 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Feb 23 03:25:05 localhost podman[85156]: 2026-02-23 08:25:05.081546529 +0000 UTC m=+0.246270997 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:25:05 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:25:05 localhost podman[85158]: 2026-02-23 08:25:05.09397874 +0000 UTC m=+0.248602629 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:25:05 localhost podman[85154]: 2026-02-23 08:25:05.093089172 +0000 UTC m=+0.260124210 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z) Feb 23 03:25:05 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:25:05 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:25:05 localhost podman[85154]: 2026-02-23 08:25:05.17349888 +0000 UTC m=+0.340533958 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, container_name=iscsid, config_id=tripleo_step3) Feb 23 03:25:05 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:25:05 localhost systemd[1]: tmp-crun.eW6XgF.mount: Deactivated successfully. Feb 23 03:25:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:25:05 localhost recover_tripleo_nova_virtqemud[85270]: 61982 Feb 23 03:25:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:25:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:25:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:25:10 localhost systemd[1]: tmp-crun.KY5cJo.mount: Deactivated successfully. Feb 23 03:25:10 localhost podman[85271]: 2026-02-23 08:25:10.917093253 +0000 UTC m=+0.088644069 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:25:11 localhost podman[85271]: 2026-02-23 08:25:11.300321365 +0000 UTC m=+0.471872241 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:25:11 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:25:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:25:12 localhost systemd[1]: tmp-crun.xVzjyM.mount: Deactivated successfully. Feb 23 03:25:12 localhost podman[85296]: 2026-02-23 08:25:12.930485185 +0000 UTC m=+0.091447256 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64) Feb 23 03:25:12 localhost podman[85295]: 2026-02-23 08:25:12.976272294 +0000 UTC m=+0.139310378 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., vcs-type=git, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 23 03:25:13 localhost podman[85295]: 2026-02-23 08:25:13.025341864 +0000 UTC m=+0.188379958 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:25:13 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:25:13 localhost podman[85294]: 2026-02-23 08:25:13.029521912 +0000 UTC m=+0.194754373 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, 
com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public) Feb 23 03:25:13 localhost podman[85294]: 2026-02-23 08:25:13.113760637 +0000 UTC m=+0.278993108 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, tcib_managed=true, vcs-type=git, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:25:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:25:13 localhost podman[85296]: 2026-02-23 08:25:13.136311146 +0000 UTC m=+0.297273227 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com) Feb 23 03:25:13 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:25:15 localhost sshd[85368]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:25:19 localhost sshd[85370]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:25:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:25:33 localhost podman[85418]: 2026-02-23 08:25:33.916150491 +0000 UTC m=+0.089145766 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, 
distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1) Feb 23 03:25:33 localhost podman[85418]: 2026-02-23 08:25:33.932271584 +0000 UTC m=+0.105266859 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Feb 23 03:25:33 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:25:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:25:35 localhost systemd[1]: tmp-crun.hcSaya.mount: Deactivated successfully. Feb 23 03:25:35 localhost podman[85439]: 2026-02-23 08:25:35.931531365 +0000 UTC m=+0.098518123 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 23 03:25:35 localhost podman[85439]: 2026-02-23 08:25:35.963730418 +0000 UTC m=+0.130717136 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13) Feb 23 03:25:35 localhost systemd[1]: tmp-crun.uHWQP4.mount: Deactivated successfully. Feb 23 03:25:35 localhost podman[85448]: 2026-02-23 08:25:35.988777564 +0000 UTC m=+0.144443785 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_id=tripleo_step5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com) Feb 23 03:25:35 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:25:36 localhost podman[85448]: 2026-02-23 08:25:36.021269847 +0000 UTC m=+0.176936088 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Feb 23 03:25:36 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:25:36 localhost podman[85440]: 2026-02-23 08:25:36.045896989 +0000 UTC m=+0.210193925 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi) Feb 23 03:25:36 localhost podman[85440]: 2026-02-23 08:25:36.084506349 +0000 UTC m=+0.248803295 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:25:36 localhost podman[85438]: 2026-02-23 08:25:36.096849987 +0000 UTC m=+0.266394212 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, release=1766032510, config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack 
TripleO Team, managed_by=tripleo_ansible) Feb 23 03:25:36 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:25:36 localhost podman[85441]: 2026-02-23 08:25:36.063708534 +0000 UTC m=+0.222946355 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:25:36 localhost podman[85438]: 2026-02-23 08:25:36.130266058 +0000 UTC m=+0.299810263 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, 
io.openshift.expose-services=, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, batch=17.1_20260112.1) Feb 23 03:25:36 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:25:36 localhost podman[85441]: 2026-02-23 08:25:36.147196366 +0000 UTC m=+0.306434147 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:25:36 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:25:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:25:41 localhost systemd[1]: tmp-crun.m3jrMO.mount: Deactivated successfully. Feb 23 03:25:41 localhost podman[85554]: 2026-02-23 08:25:41.921178797 +0000 UTC m=+0.096320745 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4) Feb 23 03:25:42 localhost podman[85554]: 2026-02-23 08:25:42.358245825 +0000 UTC m=+0.533387753 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, 
distribution-scope=public, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:25:42 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:25:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:25:43 localhost systemd[1]: tmp-crun.Hcmpjm.mount: Deactivated successfully. Feb 23 03:25:43 localhost podman[85577]: 2026-02-23 08:25:43.936784987 +0000 UTC m=+0.098031097 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:25:43 localhost podman[85577]: 2026-02-23 08:25:43.963321138 +0000 UTC m=+0.124567278 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, 
batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, distribution-scope=public, architecture=x86_64) Feb 23 03:25:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:25:43 localhost podman[85579]: 2026-02-23 08:25:43.984158715 +0000 UTC m=+0.140655350 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1766032510, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:25:44 localhost podman[85578]: 2026-02-23 08:25:44.033927186 +0000 UTC m=+0.192876126 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true) Feb 23 03:25:44 localhost podman[85578]: 2026-02-23 08:25:44.106313458 +0000 UTC m=+0.265262438 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:25:44 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:25:44 localhost podman[85579]: 2026-02-23 08:25:44.189368436 +0000 UTC m=+0.345865041 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, 
io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 23 03:25:44 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:25:59 localhost sshd[85732]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:25:59 localhost sshd[85734]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:26:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:26:04 localhost podman[85736]: 2026-02-23 08:26:04.929514138 +0000 UTC m=+0.100764620 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, container_name=collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:26:04 localhost podman[85736]: 2026-02-23 08:26:04.945383946 +0000 UTC m=+0.116634438 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-collectd-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:26:04 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:26:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:26:06 localhost systemd[1]: tmp-crun.NUL0Xd.mount: Deactivated successfully. Feb 23 03:26:06 localhost podman[85756]: 2026-02-23 08:26:06.916546527 +0000 UTC m=+0.088963866 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 
17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:26:06 localhost systemd[1]: tmp-crun.o027uw.mount: Deactivated successfully. 
Feb 23 03:26:06 localhost podman[85759]: 2026-02-23 08:26:06.974786559 +0000 UTC m=+0.137503825 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20260112.1) Feb 23 03:26:06 localhost podman[85756]: 2026-02-23 08:26:06.981431298 +0000 UTC m=+0.153848587 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, tcib_managed=true, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:26:06 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:26:07 localhost podman[85758]: 2026-02-23 08:26:07.034651488 +0000 UTC m=+0.200127367 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, build-date=2026-01-12T23:07:30Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 23 03:26:07 localhost podman[85757]: 2026-02-23 08:26:06.935814187 +0000 UTC m=+0.102657078 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 
'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z) Feb 23 03:26:07 localhost podman[85765]: 2026-02-23 08:26:06.957692955 +0000 UTC m=+0.115386540 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': 
'/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public) Feb 23 03:26:07 localhost podman[85758]: 2026-02-23 08:26:07.067235828 +0000 UTC m=+0.232711657 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, 
release=1766032510, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:26:07 localhost podman[85757]: 2026-02-23 08:26:07.067448644 +0000 UTC m=+0.234291545 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:26:07 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:26:07 localhost podman[85759]: 2026-02-23 08:26:07.087532778 +0000 UTC m=+0.250250034 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) Feb 23 03:26:07 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:26:07 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:26:07 localhost podman[85765]: 2026-02-23 08:26:07.141966645 +0000 UTC m=+0.299660310 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 23 03:26:07 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:26:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:26:12 localhost podman[85873]: 2026-02-23 08:26:12.883578946 +0000 UTC m=+0.062794138 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:26:13 localhost podman[85873]: 2026-02-23 08:26:13.305366468 +0000 UTC m=+0.484581720 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:26:13 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:26:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:26:14 localhost systemd[1]: tmp-crun.idEAVE.mount: Deactivated successfully. Feb 23 03:26:14 localhost podman[85896]: 2026-02-23 08:26:14.92460945 +0000 UTC m=+0.097025618 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, distribution-scope=public, release=1766032510, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:26:14 localhost podman[85896]: 2026-02-23 08:26:14.95622541 +0000 UTC m=+0.128641558 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=) Feb 23 03:26:14 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:26:14 localhost podman[85898]: 2026-02-23 08:26:14.97650498 +0000 UTC m=+0.141473644 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1) Feb 23 03:26:15 localhost podman[85897]: 2026-02-23 08:26:15.025280687 +0000 UTC m=+0.193510009 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:26:15 localhost podman[85897]: 2026-02-23 08:26:15.07628891 +0000 UTC m=+0.244518282 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1) Feb 23 03:26:15 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:26:15 localhost podman[85898]: 2026-02-23 08:26:15.219477365 +0000 UTC m=+0.384446019 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:26:15 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:26:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:26:35 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:26:35 localhost recover_tripleo_nova_virtqemud[86018]: 61982 Feb 23 03:26:35 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:26:35 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:26:35 localhost podman[86015]: 2026-02-23 08:26:35.929185792 +0000 UTC m=+0.094962467 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13) Feb 23 03:26:35 localhost podman[86015]: 2026-02-23 08:26:35.947598045 +0000 UTC m=+0.113374710 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, tcib_managed=true, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5) Feb 23 03:26:35 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:26:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:26:37 localhost podman[86039]: 2026-02-23 08:26:37.882662822 +0000 UTC m=+0.055053287 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, distribution-scope=public) Feb 23 03:26:37 localhost podman[86038]: 2026-02-23 08:26:37.927695246 +0000 UTC m=+0.097012448 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, tcib_managed=true, konflux.additional-tags=17.1.13 
17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 23 03:26:37 localhost podman[86039]: 2026-02-23 08:26:37.951284876 +0000 UTC m=+0.123675381 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510) Feb 23 03:26:37 localhost podman[86037]: 2026-02-23 08:26:37.96906599 +0000 UTC m=+0.141299650 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:26:37 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:26:37 localhost podman[86041]: 2026-02-23 08:26:37.997993359 +0000 UTC m=+0.167777525 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:26:38 localhost podman[86040]: 2026-02-23 08:26:37.951076729 +0000 UTC m=+0.121604837 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:26:38 localhost 
podman[86038]: 2026-02-23 08:26:38.017092123 +0000 UTC m=+0.186409305 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container) Feb 23 03:26:38 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:26:38 localhost podman[86040]: 2026-02-23 08:26:38.035380644 +0000 UTC m=+0.205908802 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z) Feb 23 03:26:38 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:26:38 localhost podman[86037]: 2026-02-23 08:26:38.058654873 +0000 UTC m=+0.230888573 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com) Feb 23 03:26:38 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:26:38 localhost podman[86041]: 2026-02-23 08:26:38.076476279 +0000 UTC m=+0.246260465 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:26:38 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:26:38 localhost systemd[1]: tmp-crun.Rqs7lX.mount: Deactivated successfully. Feb 23 03:26:40 localhost sshd[86153]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:26:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:26:43 localhost podman[86155]: 2026-02-23 08:26:43.899166468 +0000 UTC m=+0.070387017 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 23 03:26:44 localhost sshd[86175]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:26:44 localhost podman[86155]: 2026-02-23 08:26:44.29237657 +0000 UTC m=+0.463597129 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com) Feb 23 03:26:44 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:26:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:26:45 localhost podman[86180]: 2026-02-23 08:26:45.913891401 +0000 UTC m=+0.080311416 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, 
build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible) Feb 23 03:26:45 localhost systemd[1]: tmp-crun.2aanf6.mount: Deactivated successfully. 
Feb 23 03:26:45 localhost podman[86178]: 2026-02-23 08:26:45.977781722 +0000 UTC m=+0.151302390 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:26:46 localhost podman[86178]: 2026-02-23 08:26:46.028449055 +0000 UTC m=+0.201969743 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true) Feb 23 03:26:46 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:26:46 localhost podman[86179]: 2026-02-23 08:26:46.115504913 +0000 UTC m=+0.284661640 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., version=17.1.13) Feb 23 03:26:46 localhost podman[86180]: 2026-02-23 08:26:46.142326109 +0000 UTC m=+0.308746084 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f 
(image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:26:46 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:26:46 localhost podman[86179]: 2026-02-23 08:26:46.187649502 +0000 UTC m=+0.356806249 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Feb 23 03:26:46 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:26:46 localhost systemd[1]: tmp-crun.GRHFdc.mount: Deactivated successfully. 
Feb 23 03:27:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:27:06 localhost systemd[1]: tmp-crun.tVOK4k.mount: Deactivated successfully. Feb 23 03:27:06 localhost podman[86328]: 2026-02-23 08:27:06.936941513 +0000 UTC m=+0.107714793 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:27:06 localhost podman[86328]: 2026-02-23 08:27:06.948123346 +0000 UTC m=+0.118896656 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible) Feb 23 03:27:06 localhost systemd[1]: 
186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:27:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:27:08 localhost systemd[1]: tmp-crun.J6cW6y.mount: Deactivated successfully. Feb 23 03:27:08 localhost podman[86350]: 2026-02-23 08:27:08.934290808 +0000 UTC m=+0.097243199 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, version=17.1.13, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true) Feb 23 03:27:08 localhost systemd[1]: tmp-crun.OK3cbs.mount: Deactivated successfully. 
Feb 23 03:27:09 localhost podman[86350]: 2026-02-23 08:27:09.013684369 +0000 UTC m=+0.176636820 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:27:09 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:27:09 localhost podman[86351]: 2026-02-23 08:27:08.994381319 +0000 UTC m=+0.152971949 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 23 03:27:09 localhost podman[86349]: 2026-02-23 08:27:09.016537089 +0000 UTC m=+0.181956250 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13) Feb 23 03:27:09 localhost podman[86357]: 2026-02-23 08:27:09.023640957 +0000 UTC m=+0.180166989 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute) Feb 23 03:27:09 localhost podman[86351]: 2026-02-23 08:27:09.079208631 +0000 UTC m=+0.237799281 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, tcib_managed=true, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:27:09 localhost podman[86348]: 2026-02-23 08:27:09.086739612 +0000 UTC m=+0.256221867 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z) Feb 23 03:27:09 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:27:09 localhost podman[86349]: 2026-02-23 08:27:09.095657881 +0000 UTC m=+0.261077062 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510) Feb 23 03:27:09 localhost podman[86357]: 2026-02-23 08:27:09.103856831 +0000 UTC m=+0.260382823 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, container_name=nova_compute, 
description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:27:09 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:27:09 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:27:09 localhost podman[86348]: 2026-02-23 08:27:09.147319005 +0000 UTC m=+0.316801250 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:27:09 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:27:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:27:14 localhost podman[86469]: 2026-02-23 08:27:14.904919857 +0000 UTC m=+0.080486542 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, 
name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 23 03:27:15 localhost podman[86469]: 2026-02-23 08:27:15.314202622 +0000 UTC m=+0.489769307 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:27:15 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:27:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:27:16 localhost systemd[1]: tmp-crun.TdFd3k.mount: Deactivated successfully. Feb 23 03:27:16 localhost podman[86493]: 2026-02-23 08:27:16.903377143 +0000 UTC m=+0.080046889 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public) Feb 23 03:27:16 localhost podman[86492]: 2026-02-23 08:27:16.965991173 +0000 UTC m=+0.144208443 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:27:17 localhost podman[86494]: 2026-02-23 08:27:17.012108684 +0000 UTC m=+0.185757476 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.buildah.version=1.41.5, container_name=metrics_qdr, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 23 03:27:17 localhost podman[86492]: 2026-02-23 08:27:17.01734954 +0000 UTC m=+0.195566820 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:27:17 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:27:17 localhost podman[86493]: 2026-02-23 08:27:17.035645781 +0000 UTC m=+0.212315517 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:27:17 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:27:17 localhost podman[86494]: 2026-02-23 08:27:17.230296745 +0000 UTC m=+0.403945597 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, version=17.1.13, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:27:17 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:27:17 localhost systemd[1]: tmp-crun.cDjKsh.mount: Deactivated successfully. Feb 23 03:27:24 localhost sshd[86568]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:27:34 localhost sshd[86615]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:27:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:27:37 localhost systemd[1]: tmp-crun.mjn1xo.mount: Deactivated successfully. 
Feb 23 03:27:37 localhost podman[86617]: 2026-02-23 08:27:37.918659059 +0000 UTC m=+0.090841031 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:27:37 localhost podman[86617]: 2026-02-23 08:27:37.935618494 +0000 UTC m=+0.107800476 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 23 03:27:37 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:27:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:27:40 localhost systemd[1]: tmp-crun.3DH3FI.mount: Deactivated successfully. Feb 23 03:27:40 localhost podman[86639]: 2026-02-23 08:27:40.342218952 +0000 UTC m=+0.218648225 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 23 03:27:40 localhost podman[86638]: 2026-02-23 08:27:40.364820384 +0000 UTC m=+0.239928860 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, release=1766032510, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13) Feb 23 03:27:40 localhost podman[86637]: 2026-02-23 08:27:40.830140766 +0000 UTC m=+0.709677875 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:27:40 localhost podman[86643]: 2026-02-23 08:27:40.379417992 +0000 UTC m=+0.251167513 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true) Feb 23 03:27:40 localhost podman[86637]: 2026-02-23 08:27:40.838210312 +0000 UTC m=+0.717747391 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, container_name=iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 23 03:27:40 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:27:40 localhost podman[86638]: 2026-02-23 08:27:40.848659685 +0000 UTC m=+0.723768191 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:27:40 localhost podman[86639]: 2026-02-23 08:27:40.865020103 +0000 UTC m=+0.741449386 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:27:40 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:27:40 localhost podman[86643]: 2026-02-23 08:27:40.867214803 +0000 UTC m=+0.738964324 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 23 03:27:40 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:27:40 localhost podman[86640]: 2026-02-23 08:27:40.431920401 +0000 UTC m=+0.302473449 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Feb 23 03:27:40 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:27:40 localhost podman[86640]: 2026-02-23 08:27:40.936374708 +0000 UTC m=+0.806927766 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5) Feb 23 03:27:40 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:27:41 localhost systemd[1]: tmp-crun.Dhd2Q6.mount: Deactivated successfully. Feb 23 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:27:45 localhost podman[86754]: 2026-02-23 08:27:45.918742408 +0000 UTC m=+0.087561981 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, 
batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:27:46 localhost podman[86754]: 2026-02-23 08:27:46.307356064 +0000 UTC m=+0.476175557 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:27:46 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:27:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:27:47 localhost systemd[1]: tmp-crun.qGTSAo.mount: Deactivated successfully. Feb 23 03:27:47 localhost podman[86778]: 2026-02-23 08:27:47.917198284 +0000 UTC m=+0.089976067 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:27:47 localhost podman[86777]: 2026-02-23 08:27:47.962023847 +0000 UTC m=+0.137328991 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=) Feb 23 03:27:47 localhost podman[86778]: 2026-02-23 08:27:47.978308303 +0000 UTC m=+0.151086056 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, 
vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=) Feb 23 03:27:47 localhost podman[86777]: 2026-02-23 08:27:47.990391231 +0000 UTC m=+0.165696365 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, architecture=x86_64) Feb 23 03:27:47 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:27:48 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:27:48 localhost podman[86779]: 2026-02-23 08:27:48.068026672 +0000 UTC m=+0.236903506 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, url=https://www.redhat.com) Feb 23 03:27:48 localhost podman[86779]: 2026-02-23 08:27:48.274261899 +0000 UTC m=+0.443138733 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, release=1766032510) Feb 23 03:27:48 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:28:07 localhost sshd[86930]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:28:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:28:08 localhost systemd[1]: tmp-crun.QRUx4A.mount: Deactivated successfully. 
Feb 23 03:28:08 localhost podman[86932]: 2026-02-23 08:28:08.935222043 +0000 UTC m=+0.108693977 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
distribution-scope=public, vcs-type=git, container_name=collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 23 03:28:08 localhost podman[86932]: 2026-02-23 08:28:08.95111733 +0000 UTC m=+0.124589234 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, container_name=collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, 
io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:28:08 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:28:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:28:11 localhost systemd[83969]: Created slice User Background Tasks Slice. Feb 23 03:28:11 localhost podman[86954]: 2026-02-23 08:28:11.948028535 +0000 UTC m=+0.102181461 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=) Feb 23 03:28:11 localhost systemd[83969]: Starting Cleanup of User's Temporary Files and Directories... Feb 23 03:28:11 localhost systemd[83969]: Finished Cleanup of User's Temporary Files and Directories. 
Feb 23 03:28:11 localhost podman[86954]: 2026-02-23 08:28:11.993298905 +0000 UTC m=+0.147451781 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=) Feb 23 03:28:11 localhost podman[86955]: 2026-02-23 08:28:11.993524151 +0000 UTC m=+0.146268165 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 
cron, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:28:12 localhost podman[86952]: 2026-02-23 08:28:12.09165011 +0000 UTC m=+0.257100876 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:28:12 localhost systemd[1]: 
9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:28:12 localhost podman[86955]: 2026-02-23 08:28:12.121920709 +0000 UTC m=+0.274664653 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, 
config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:28:12 localhost podman[86952]: 2026-02-23 08:28:12.131982962 +0000 UTC m=+0.297433728 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=) Feb 23 03:28:12 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:28:12 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:28:12 localhost podman[86961]: 2026-02-23 08:28:12.190751578 +0000 UTC m=+0.342499512 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:28:12 localhost podman[86953]: 2026-02-23 08:28:12.104923449 +0000 UTC m=+0.266131187 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com) Feb 23 03:28:12 localhost podman[86961]: 2026-02-23 08:28:12.222849062 +0000 UTC m=+0.374596996 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:28:12 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:28:12 localhost podman[86953]: 2026-02-23 08:28:12.240257654 +0000 UTC m=+0.401465362 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container) Feb 23 03:28:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:28:12 localhost systemd[1]: tmp-crun.i2Ju1f.mount: Deactivated successfully. Feb 23 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:28:16 localhost podman[87064]: 2026-02-23 08:28:16.900483756 +0000 UTC m=+0.073939252 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, container_name=nova_migration_target, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, batch=17.1_20260112.1) Feb 23 03:28:17 localhost podman[87064]: 2026-02-23 08:28:17.258612366 +0000 UTC m=+0.432067902 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:28:17 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:28:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:28:18 localhost podman[87090]: 2026-02-23 08:28:18.906823698 +0000 UTC m=+0.078053326 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1) Feb 23 03:28:18 localhost systemd[1]: tmp-crun.Bep8KR.mount: Deactivated successfully. 
Feb 23 03:28:18 localhost podman[87090]: 2026-02-23 08:28:18.970718178 +0000 UTC m=+0.141947836 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, 
batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:28:18 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:28:19 localhost podman[87089]: 2026-02-23 08:28:18.976270655 +0000 UTC m=+0.149516363 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=ovn_controller, description=Red Hat OpenStack Platform 
17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:28:19 localhost podman[87089]: 2026-02-23 08:28:19.056793804 +0000 UTC m=+0.230039522 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, 
container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, release=1766032510, vcs-type=git) Feb 23 03:28:19 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:28:19 localhost podman[87091]: 2026-02-23 08:28:19.120669604 +0000 UTC m=+0.291922503 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, config_id=tripleo_step1, url=https://www.redhat.com) Feb 23 03:28:19 localhost podman[87091]: 2026-02-23 08:28:19.324478526 +0000 UTC m=+0.495731445 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:28:19 localhost 
systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:28:23 localhost sshd[87165]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:28:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:28:23 localhost recover_tripleo_nova_virtqemud[87168]: 61982 Feb 23 03:28:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:28:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:28:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:28:40 localhost systemd[1]: tmp-crun.7ZdWkc.mount: Deactivated successfully. Feb 23 03:28:40 localhost podman[87214]: 2026-02-23 08:28:40.787920023 +0000 UTC m=+0.058351025 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:28:40 localhost podman[87214]: 2026-02-23 08:28:40.800118419 +0000 UTC m=+0.070549431 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, config_id=tripleo_step3, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public) Feb 23 03:28:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:28:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:28:43 localhost podman[87235]: 2026-02-23 08:28:43.32293923 +0000 UTC m=+0.069610243 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 23 03:28:43 localhost podman[87235]: 2026-02-23 08:28:43.342022813 +0000 UTC m=+0.088693816 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team) Feb 23 03:28:43 localhost podman[87234]: 2026-02-23 08:28:43.303936578 +0000 UTC m=+0.056606611 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1) Feb 23 03:28:43 localhost podman[87236]: 2026-02-23 08:28:43.361834458 +0000 UTC m=+0.110617095 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:10:15Z) Feb 23 03:28:43 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:28:43 localhost podman[87236]: 2026-02-23 08:28:43.396201051 +0000 UTC m=+0.144983738 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:28:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:28:43 localhost podman[87234]: 2026-02-23 08:28:43.436397598 +0000 UTC m=+0.189067641 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:28:43 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:28:43 localhost podman[87233]: 2026-02-23 08:28:43.4700895 +0000 UTC m=+0.225055452 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, 
Inc., io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:28:43 localhost podman[87233]: 2026-02-23 08:28:43.503892717 +0000 UTC m=+0.258858619 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, 
distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, container_name=iscsid, vendor=Red Hat, Inc.) Feb 23 03:28:43 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:28:43 localhost podman[87242]: 2026-02-23 08:28:43.520087803 +0000 UTC m=+0.267308182 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:28:43 localhost podman[87242]: 2026-02-23 08:28:43.552160557 +0000 UTC m=+0.299380876 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute) Feb 23 03:28:43 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:28:44 localhost systemd[1]: tmp-crun.rCaPcr.mount: Deactivated successfully. Feb 23 03:28:47 localhost sshd[87343]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:28:47 localhost systemd[1]: tmp-crun.aTjdAp.mount: Deactivated successfully. 
Feb 23 03:28:47 localhost podman[87345]: 2026-02-23 08:28:47.831166333 +0000 UTC m=+0.090490559 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:28:48 localhost podman[87345]: 2026-02-23 08:28:48.222242464 +0000 UTC m=+0.481566720 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.13, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5) Feb 23 03:28:48 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:28:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:28:49 localhost systemd[1]: tmp-crun.JZSdKG.mount: Deactivated successfully. Feb 23 03:28:49 localhost podman[87368]: 2026-02-23 08:28:49.922377246 +0000 UTC m=+0.097539092 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:28:49 localhost podman[87368]: 2026-02-23 08:28:49.97644214 +0000 UTC m=+0.151603916 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:28:49 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:28:50 localhost podman[87369]: 2026-02-23 08:28:49.977276895 +0000 UTC m=+0.147644176 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, vendor=Red 
Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:28:50 localhost podman[87369]: 2026-02-23 
08:28:50.058704332 +0000 UTC m=+0.229071653 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:28:50 localhost podman[87370]: 2026-02-23 08:28:50.069690502 +0000 UTC m=+0.240088584 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 23 03:28:50 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:28:50 localhost podman[87370]: 2026-02-23 08:28:50.272473775 +0000 UTC m=+0.442871857 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, batch=17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container) Feb 23 03:28:50 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:29:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:29:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 622 writes, 2372 keys, 622 commit groups, 1.0 writes per commit group, ingest: 2.93 MB, 0.00 MB/s#012Interval WAL: 622 writes, 215 syncs, 2.89 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:29:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:29:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5421 writes, 705 
syncs, 7.69 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 408 writes, 1665 keys, 408 commit groups, 1.0 writes per commit group, ingest: 2.08 MB, 0.00 MB/s#012Interval WAL: 408 writes, 144 syncs, 2.83 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:29:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:29:11 localhost podman[87521]: 2026-02-23 08:29:11.891490515 +0000 UTC m=+0.062310873 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64) Feb 23 03:29:11 localhost podman[87521]: 2026-02-23 08:29:11.900849956 +0000 UTC m=+0.071670284 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, description=Red 
Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=) Feb 23 03:29:11 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:29:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:29:13 localhost podman[87542]: 2026-02-23 08:29:13.925039477 +0000 UTC m=+0.098652956 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z) Feb 23 03:29:13 localhost podman[87541]: 2026-02-23 08:29:13.972442124 +0000 UTC m=+0.145525746 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, version=17.1.13) Feb 23 03:29:13 localhost podman[87542]: 2026-02-23 08:29:13.988262693 +0000 UTC m=+0.161876122 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64) Feb 23 03:29:13 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:29:14 localhost podman[87543]: 2026-02-23 08:29:14.027685577 +0000 UTC m=+0.199733657 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=) Feb 23 03:29:14 localhost podman[87541]: 2026-02-23 08:29:14.062341302 +0000 UTC m=+0.235424934 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:29:14 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:29:14 localhost podman[87545]: 2026-02-23 08:29:14.074129884 +0000 UTC m=+0.241472225 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:29:14 localhost podman[87543]: 2026-02-23 08:29:14.080363911 +0000 UTC m=+0.252412041 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:29:14 localhost 
systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:29:14 localhost podman[87544]: 2026-02-23 08:29:14.123459372 +0000 UTC m=+0.290895686 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.13, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron) Feb 23 03:29:14 localhost podman[87544]: 2026-02-23 08:29:14.133246101 +0000 UTC m=+0.300682455 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:29:14 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:29:14 localhost podman[87545]: 2026-02-23 08:29:14.155661679 +0000 UTC m=+0.323004000 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, release=1766032510, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:29:14 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:29:14 localhost sshd[87659]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:29:18 localhost podman[87661]: 2026-02-23 08:29:18.915044806 +0000 UTC m=+0.087039859 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:29:19 localhost podman[87661]: 2026-02-23 08:29:19.332323181 +0000 UTC m=+0.504318244 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4) Feb 23 03:29:19 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:29:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:29:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:29:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:29:20 localhost systemd[1]: tmp-crun.X3irS5.mount: Deactivated successfully. Feb 23 03:29:20 localhost podman[87684]: 2026-02-23 08:29:20.957436463 +0000 UTC m=+0.134870929 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:29:21 localhost podman[87686]: 2026-02-23 08:29:21.015447645 +0000 UTC m=+0.186688115 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, release=1766032510, config_id=tripleo_step1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.) 
Feb 23 03:29:21 localhost podman[87685]: 2026-02-23 08:29:20.981761252 +0000 UTC m=+0.155112169 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 23 03:29:21 localhost podman[87685]: 2026-02-23 08:29:21.067688144 +0000 UTC m=+0.241039021 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git) Feb 23 03:29:21 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:29:21 localhost podman[87684]: 2026-02-23 08:29:21.120205143 +0000 UTC m=+0.297639609 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:29:21 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:29:21 localhost podman[87686]: 2026-02-23 08:29:21.245356415 +0000 UTC m=+0.416596925 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:29:21 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:29:21 localhost systemd[1]: tmp-crun.YBg8Ml.mount: Deactivated successfully. 
Feb 23 03:29:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=51706 SEQ=0 ACK=1003347813 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:29:27 localhost sshd[87761]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:29:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:29:42 localhost systemd[1]: tmp-crun.3YIuMB.mount: Deactivated successfully. Feb 23 03:29:42 localhost podman[87810]: 2026-02-23 08:29:42.915917089 +0000 UTC m=+0.087111253 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, tcib_managed=true) Feb 23 03:29:42 localhost podman[87810]: 2026-02-23 08:29:42.955284282 +0000 UTC m=+0.126478436 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:29:42 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:29:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:29:45 localhost systemd[1]: tmp-crun.x7ZaTW.mount: Deactivated successfully. 
Feb 23 03:29:46 localhost podman[87831]: 2026-02-23 08:29:45.999964856 +0000 UTC m=+1.173453992 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public) Feb 23 03:29:46 localhost podman[87833]: 2026-02-23 08:29:46.050390688 +0000 UTC m=+1.221187340 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:29:46 localhost podman[87834]: 2026-02-23 08:29:46.090554797 +0000 UTC m=+1.259719707 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_compute) Feb 23 03:29:46 localhost podman[87830]: 2026-02-23 08:29:46.098129905 +0000 UTC m=+1.270165925 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:29:46 localhost podman[87830]: 2026-02-23 08:29:46.108371909 +0000 UTC m=+1.280407919 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64) Feb 23 03:29:46 localhost podman[87834]: 2026-02-23 08:29:46.116222567 +0000 UTC m=+1.285387477 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:29:46 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:29:46 localhost podman[87831]: 2026-02-23 08:29:46.124463587 +0000 UTC m=+1.297952723 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4) Feb 23 03:29:46 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:29:46 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:29:46 localhost podman[87833]: 2026-02-23 08:29:46.13628148 +0000 UTC m=+1.307078132 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:29:46 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:29:46 localhost podman[87832]: 2026-02-23 08:29:46.19011052 +0000 UTC m=+1.363552034 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git) Feb 23 03:29:46 localhost podman[87832]: 2026-02-23 08:29:46.219213079 +0000 UTC m=+1.392654613 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true) Feb 23 03:29:46 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:29:46 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:29:46 localhost recover_tripleo_nova_virtqemud[87950]: 61982 Feb 23 03:29:46 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:29:46 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:29:46 localhost systemd[1]: tmp-crun.prKG6S.mount: Deactivated successfully. Feb 23 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:29:49 localhost systemd[1]: tmp-crun.gGeak4.mount: Deactivated successfully. Feb 23 03:29:49 localhost podman[87951]: 2026-02-23 08:29:49.928068244 +0000 UTC m=+0.101451725 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:29:50 localhost podman[87951]: 2026-02-23 08:29:50.326302588 +0000 UTC m=+0.499686019 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 23 03:29:50 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:29:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:29:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:29:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:29:51 localhost systemd[1]: tmp-crun.wuozqy.mount: Deactivated successfully. 
Feb 23 03:29:51 localhost podman[87975]: 2026-02-23 08:29:51.91596658 +0000 UTC m=+0.090163327 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, release=1766032510, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 
17.1_20260112.1, tcib_managed=true, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:29:51 localhost systemd[1]: tmp-crun.AWThvj.mount: Deactivated successfully. Feb 23 03:29:51 localhost podman[87976]: 2026-02-23 08:29:51.975205071 +0000 UTC m=+0.143956896 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64) Feb 23 03:29:52 localhost podman[87976]: 2026-02-23 08:29:52.01633114 +0000 UTC m=+0.185082965 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent) Feb 23 03:29:52 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:29:52 localhost podman[87977]: 2026-02-23 08:29:52.03216444 +0000 UTC m=+0.196541897 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, container_name=metrics_qdr, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:29:52 localhost podman[87975]: 2026-02-23 08:29:52.042990712 +0000 UTC m=+0.217187439 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 
17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller) Feb 23 03:29:52 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:29:52 localhost podman[87977]: 2026-02-23 08:29:52.232154444 +0000 UTC m=+0.396531931 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1) Feb 23 03:29:52 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:30:02 localhost sshd[88053]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:30:05 localhost sshd[88055]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:30:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:30:13 localhost podman[88184]: 2026-02-23 08:30:13.925342542 +0000 UTC m=+0.094602658 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_step3, container_name=collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com) Feb 23 03:30:13 localhost podman[88184]: 2026-02-23 08:30:13.93828463 +0000 UTC m=+0.107544706 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:30:13 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:30:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=43100 SEQ=0 ACK=564930235 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:30:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:30:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:30:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:30:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:30:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:30:18 localhost systemd[1]: tmp-crun.inRsQQ.mount: Deactivated successfully. 
Feb 23 03:30:18 localhost podman[88208]: 2026-02-23 08:30:18.037176902 +0000 UTC m=+0.071780278 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:30:18 localhost podman[88209]: 2026-02-23 08:30:18.068260223 +0000 UTC m=+0.101357822 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_id=tripleo_step4, 
distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:30:18 localhost podman[88208]: 2026-02-23 08:30:18.082398999 +0000 UTC m=+0.117002375 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:30:18 localhost podman[88215]: 2026-02-23 08:30:18.087052406 +0000 UTC m=+0.118218674 
container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:30:18 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:30:18 localhost podman[88207]: 2026-02-23 08:30:18.358312251 +0000 UTC m=+0.395046345 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:30:18 localhost podman[88206]: 2026-02-23 08:30:18.362998199 +0000 UTC m=+0.403001866 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64) Feb 23 03:30:18 localhost podman[88206]: 2026-02-23 08:30:18.381276386 +0000 UTC m=+0.421280083 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, 
konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:30:18 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:30:18 localhost podman[88207]: 2026-02-23 08:30:18.408293419 +0000 UTC m=+0.445027493 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute) Feb 23 03:30:18 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:30:18 localhost podman[88209]: 2026-02-23 08:30:18.426216875 +0000 UTC m=+0.459314444 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, tcib_managed=true, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 23 03:30:18 localhost podman[88215]: 2026-02-23 08:30:18.432226385 +0000 UTC m=+0.463392653 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:30:18 localhost systemd[1]: 
b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:30:18 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:30:19 localhost systemd[1]: tmp-crun.RPix3z.mount: Deactivated successfully. Feb 23 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:30:20 localhost systemd[1]: tmp-crun.4Df6Fo.mount: Deactivated successfully. Feb 23 03:30:20 localhost podman[88320]: 2026-02-23 08:30:20.920797701 +0000 UTC m=+0.094621149 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:30:21 localhost podman[88320]: 2026-02-23 08:30:21.341312319 +0000 UTC m=+0.515135777 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, 
com.redhat.component=openstack-nova-compute-container, release=1766032510, io.openshift.expose-services=, container_name=nova_migration_target, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 
03:30:21 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:30:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:30:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:30:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:30:22 localhost podman[88346]: 2026-02-23 08:30:22.903521885 +0000 UTC m=+0.075035050 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:30:22 localhost systemd[1]: tmp-crun.dtQzxd.mount: Deactivated successfully. 
Feb 23 03:30:22 localhost podman[88345]: 2026-02-23 08:30:22.922798404 +0000 UTC m=+0.096204690 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:30:22 localhost podman[88345]: 2026-02-23 08:30:22.991798982 +0000 UTC m=+0.165205328 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 23 03:30:23 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:30:23 localhost podman[88344]: 2026-02-23 08:30:22.997797451 +0000 UTC m=+0.172562239 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.5) Feb 23 03:30:23 localhost podman[88344]: 2026-02-23 08:30:23.082328171 +0000 UTC m=+0.257092949 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:30:23 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:30:23 localhost podman[88346]: 2026-02-23 08:30:23.134478237 +0000 UTC m=+0.305991402 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, 
architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:30:23 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:30:23 localhost systemd[1]: tmp-crun.CtWVg0.mount: Deactivated successfully. Feb 23 03:30:43 localhost sshd[88464]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:30:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:30:44 localhost systemd[1]: tmp-crun.VI8bHF.mount: Deactivated successfully. 
Feb 23 03:30:44 localhost podman[88466]: 2026-02-23 08:30:44.937312306 +0000 UTC m=+0.104726437 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, architecture=x86_64) Feb 23 03:30:44 localhost podman[88466]: 2026-02-23 08:30:44.952355261 +0000 UTC m=+0.119769422 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z) Feb 23 03:30:44 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:30:48 localhost sshd[88485]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:30:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:30:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:30:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:30:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:30:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:30:48 localhost podman[88489]: 2026-02-23 08:30:48.925614817 +0000 UTC m=+0.088758994 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron) Feb 23 03:30:48 localhost systemd[1]: tmp-crun.72JDQk.mount: Deactivated successfully. 
Feb 23 03:30:48 localhost podman[88487]: 2026-02-23 08:30:48.972029582 +0000 UTC m=+0.139877688 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:30:49 localhost podman[88489]: 2026-02-23 08:30:49.011379904 +0000 UTC m=+0.174524051 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:30:49 localhost podman[88487]: 2026-02-23 08:30:49.020951107 +0000 UTC m=+0.188799213 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64) Feb 23 03:30:49 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:30:49 localhost podman[88486]: 2026-02-23 08:30:49.030701285 +0000 UTC m=+0.198894071 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:30:49 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:30:49 localhost podman[88486]: 2026-02-23 08:30:49.067339671 +0000 UTC m=+0.235532427 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:30:49 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:30:49 localhost podman[88500]: 2026-02-23 08:30:48.950619507 +0000 UTC m=+0.103792248 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:30:49 localhost podman[88488]: 2026-02-23 08:30:49.071465272 +0000 UTC m=+0.235375743 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git) Feb 23 03:30:49 localhost podman[88500]: 2026-02-23 08:30:49.134575094 +0000 UTC m=+0.287747865 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510) Feb 23 03:30:49 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:30:49 localhost podman[88488]: 2026-02-23 08:30:49.155397782 +0000 UTC m=+0.319308233 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 23 03:30:49 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:30:49 localhost systemd[1]: tmp-crun.N2GObt.mount: Deactivated successfully. Feb 23 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:30:51 localhost systemd[1]: tmp-crun.cQOpZe.mount: Deactivated successfully. 
Feb 23 03:30:51 localhost podman[88603]: 2026-02-23 08:30:51.916312447 +0000 UTC m=+0.089384754 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:30:52 localhost podman[88603]: 2026-02-23 08:30:52.297058759 +0000 UTC m=+0.470131086 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510) Feb 23 03:30:52 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:30:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:30:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:30:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:30:53 localhost podman[88626]: 2026-02-23 08:30:53.911649349 +0000 UTC m=+0.076653681 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Feb 23 03:30:53 localhost podman[88626]: 2026-02-23 08:30:53.963301571 +0000 UTC m=+0.128305943 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1) Feb 23 03:30:53 localhost systemd[1]: tmp-crun.zrVwQE.mount: Deactivated successfully. Feb 23 03:30:53 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:30:53 localhost podman[88628]: 2026-02-23 08:30:53.98513191 +0000 UTC m=+0.145465654 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team) Feb 23 03:30:54 localhost podman[88627]: 2026-02-23 08:30:54.02376985 +0000 UTC m=+0.187531032 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 23 03:30:54 localhost podman[88627]: 2026-02-23 08:30:54.072344234 +0000 UTC m=+0.236105406 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 23 03:30:54 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:30:54 localhost podman[88628]: 2026-02-23 08:30:54.199662003 +0000 UTC m=+0.359995717 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:30:54 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:31:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:31:15 localhost systemd[1]: tmp-crun.ZVfIIo.mount: Deactivated successfully. 
Feb 23 03:31:15 localhost podman[88777]: 2026-02-23 08:31:15.903531557 +0000 UTC m=+0.076209527 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:31:15 localhost podman[88777]: 2026-02-23 08:31:15.915238687 +0000 UTC m=+0.087916687 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:31:15 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:31:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:31:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:31:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:31:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:31:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:31:19 localhost systemd[1]: tmp-crun.shRjf6.mount: Deactivated successfully. Feb 23 03:31:19 localhost podman[88801]: 2026-02-23 08:31:19.983041557 +0000 UTC m=+0.146214898 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_compute) Feb 23 03:31:19 localhost podman[88797]: 2026-02-23 08:31:19.934915847 +0000 UTC m=+0.110453729 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com) Feb 23 03:31:20 localhost podman[88797]: 2026-02-23 08:31:20.014617563 +0000 UTC m=+0.190155375 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, version=17.1.13) Feb 23 03:31:20 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:31:20 localhost podman[88799]: 2026-02-23 08:31:20.02715019 +0000 UTC m=+0.197916631 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:31:20 localhost podman[88800]: 2026-02-23 08:31:20.043193886 +0000 UTC m=+0.208754412 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:31:20 localhost podman[88799]: 2026-02-23 08:31:20.05851177 +0000 UTC m=+0.229278241 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z) Feb 23 03:31:20 localhost systemd[1]: 
9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:31:20 localhost podman[88798]: 2026-02-23 08:31:20.072076108 +0000 UTC m=+0.245961957 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:31:20 localhost podman[88800]: 2026-02-23 08:31:20.126332001 +0000 UTC m=+0.291892517 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron) Feb 23 03:31:20 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:31:20 localhost podman[88801]: 2026-02-23 08:31:20.146133906 +0000 UTC m=+0.309307287 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:31:20 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:31:20 localhost podman[88798]: 2026-02-23 08:31:20.180545363 +0000 UTC m=+0.354431202 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:31:20 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:31:20 localhost sshd[88912]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:31:20 localhost systemd[1]: tmp-crun.LaLP9m.mount: Deactivated successfully. Feb 23 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:31:22 localhost systemd[1]: tmp-crun.CrN1nI.mount: Deactivated successfully. 
Feb 23 03:31:22 localhost podman[88914]: 2026-02-23 08:31:22.929387748 +0000 UTC m=+0.099373890 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:31:23 localhost podman[88914]: 2026-02-23 08:31:23.309416843 +0000 UTC m=+0.479402965 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:31:23 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:31:24 localhost systemd[1]: tmp-crun.QoGgeI.mount: Deactivated successfully. Feb 23 03:31:24 localhost systemd[1]: tmp-crun.mgwdLU.mount: Deactivated successfully. Feb 23 03:31:24 localhost podman[88938]: 2026-02-23 08:31:24.975236067 +0000 UTC m=+0.150085011 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.5, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z) Feb 23 03:31:25 localhost podman[88940]: 2026-02-23 08:31:25.024756841 +0000 UTC m=+0.194518906 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:31:25 localhost podman[88939]: 2026-02-23 08:31:24.940517647 +0000 UTC m=+0.115032290 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, 
batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git) Feb 23 03:31:25 localhost podman[88938]: 2026-02-23 08:31:25.052806311 +0000 UTC m=+0.227655275 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4) Feb 23 03:31:25 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:31:25 localhost podman[88939]: 2026-02-23 08:31:25.072257351 +0000 UTC m=+0.246771965 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510, vcs-type=git, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, batch=17.1_20260112.1) Feb 23 03:31:25 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:31:25 localhost podman[88940]: 2026-02-23 08:31:25.202326443 +0000 UTC m=+0.372088528 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510) Feb 23 03:31:25 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:31:41 localhost sshd[89035]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:31:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:31:46 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:31:46 localhost recover_tripleo_nova_virtqemud[89041]: 61982 Feb 23 03:31:46 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:31:46 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:31:46 localhost podman[89037]: 2026-02-23 08:31:46.915262847 +0000 UTC m=+0.092619148 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container) Feb 23 03:31:46 localhost podman[89037]: 2026-02-23 08:31:46.925277451 +0000 UTC m=+0.102633752 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, architecture=x86_64) Feb 23 03:31:46 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:31:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:31:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:31:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:31:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:31:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:31:50 localhost systemd[1]: tmp-crun.2b3ZlK.mount: Deactivated successfully. Feb 23 03:31:50 localhost podman[89060]: 2026-02-23 08:31:50.909839207 +0000 UTC m=+0.078042990 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public) Feb 23 03:31:50 localhost podman[89059]: 2026-02-23 08:31:50.96891301 +0000 UTC m=+0.136978909 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:31:50 localhost podman[89059]: 2026-02-23 08:31:50.976715915 +0000 UTC m=+0.144781824 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, release=1766032510, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step3) Feb 23 03:31:50 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:31:50 localhost podman[89060]: 2026-02-23 08:31:50.990409955 +0000 UTC m=+0.158613748 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:31:50 localhost podman[89062]: 2026-02-23 08:31:50.945498176 +0000 UTC m=+0.102873040 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, container_name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:31:51 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:31:51 localhost podman[89062]: 2026-02-23 08:31:51.02720181 +0000 UTC m=+0.184576684 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, release=1766032510, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc.) Feb 23 03:31:51 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:31:51 localhost podman[89061]: 2026-02-23 08:31:51.076500086 +0000 UTC m=+0.237857305 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:31:51 localhost podman[89061]: 2026-02-23 08:31:51.13239313 +0000 UTC m=+0.293750329 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:31:51 localhost podman[89073]: 2026-02-23 08:31:51.140795774 +0000 UTC m=+0.290553928 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64) Feb 23 03:31:51 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:31:51 localhost podman[89073]: 2026-02-23 08:31:51.175299547 +0000 UTC m=+0.325057691 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64) Feb 23 03:31:51 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:31:53 localhost systemd[1]: tmp-crun.tLDK2e.mount: Deactivated successfully. 
Feb 23 03:31:53 localhost podman[89176]: 2026-02-23 08:31:53.920422827 +0000 UTC m=+0.094268408 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:31:54 localhost podman[89176]: 2026-02-23 08:31:54.350210385 +0000 UTC m=+0.524056006 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z) Feb 23 03:31:54 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:31:55 localhost podman[89200]: 2026-02-23 08:31:55.900894945 +0000 UTC m=+0.072889508 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:31:55 localhost systemd[1]: tmp-crun.u4W6rI.mount: Deactivated successfully. 
Feb 23 03:31:55 localhost podman[89199]: 2026-02-23 08:31:55.970274553 +0000 UTC m=+0.144327120 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, tcib_managed=true, release=1766032510, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z) Feb 23 03:31:55 localhost podman[89200]: 2026-02-23 08:31:55.97336945 +0000 UTC m=+0.145364003 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:31:55 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:31:56 localhost podman[89199]: 2026-02-23 08:31:56.019385894 +0000 UTC m=+0.193438431 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, release=1766032510, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:31:56 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:31:56 localhost podman[89201]: 2026-02-23 08:31:56.067620807 +0000 UTC m=+0.235933845 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, tcib_managed=true) Feb 23 03:31:56 localhost podman[89201]: 2026-02-23 08:31:56.285482094 +0000 UTC m=+0.453795151 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:31:56 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:31:59 localhost sshd[89274]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:32:13 localhost podman[89379]: 2026-02-23 08:32:13.171164799 +0000 UTC m=+0.083711928 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 03:32:13 localhost podman[89379]: 2026-02-23 08:32:13.308489688 +0000 UTC m=+0.221036867 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.42.2, GIT_CLEAN=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, release=1770267347) Feb 23 03:32:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:32:17 localhost podman[89522]: 2026-02-23 08:32:17.92896796 +0000 UTC m=+0.099577286 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd) Feb 23 03:32:17 localhost podman[89522]: 2026-02-23 08:32:17.963805132 +0000 UTC m=+0.134414318 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd) Feb 23 03:32:17 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:32:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:32:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:32:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:32:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:32:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:32:21 localhost systemd[1]: tmp-crun.Ztkzj6.mount: Deactivated successfully. Feb 23 03:32:21 localhost podman[89545]: 2026-02-23 08:32:21.915431655 +0000 UTC m=+0.082388727 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1) Feb 23 03:32:21 localhost podman[89543]: 2026-02-23 08:32:21.964885626 +0000 UTC m=+0.140521381 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 23 03:32:21 localhost podman[89545]: 2026-02-23 08:32:21.969297865 +0000 UTC m=+0.136254957 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1) Feb 23 03:32:21 localhost podman[89543]: 2026-02-23 08:32:21.978353368 +0000 UTC m=+0.153989143 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, url=https://www.redhat.com, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:32:21 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:32:21 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:32:22 localhost podman[89553]: 2026-02-23 08:32:22.021987788 +0000 UTC m=+0.180850856 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step5, container_name=nova_compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:32:22 localhost podman[89553]: 2026-02-23 08:32:22.059966669 +0000 UTC m=+0.218829737 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
url=https://www.redhat.com, version=17.1.13, distribution-scope=public, container_name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, 
build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:32:22 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:32:22 localhost podman[89544]: 2026-02-23 08:32:22.076094056 +0000 UTC m=+0.246297470 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, distribution-scope=public) Feb 23 03:32:22 localhost podman[89544]: 2026-02-23 08:32:22.107179451 +0000 UTC m=+0.277382875 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5) Feb 23 03:32:22 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:32:22 localhost podman[89551]: 2026-02-23 08:32:22.125815376 +0000 UTC m=+0.285879802 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1) Feb 23 03:32:22 localhost podman[89551]: 2026-02-23 08:32:22.131078752 +0000 UTC m=+0.291143168 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:32:22 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:32:24 localhost podman[89662]: 2026-02-23 08:32:24.894169548 +0000 UTC m=+0.068908714 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:32:25 localhost podman[89662]: 2026-02-23 08:32:25.272999075 +0000 UTC m=+0.447738221 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:32:25 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:32:26 localhost podman[89684]: 2026-02-23 08:32:26.906834785 +0000 UTC m=+0.084909455 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_controller, distribution-scope=public, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:32:26 localhost systemd[1]: tmp-crun.5EUDWj.mount: Deactivated successfully. Feb 23 03:32:26 localhost podman[89685]: 2026-02-23 08:32:26.965095433 +0000 UTC m=+0.138638361 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, container_name=ovn_metadata_agent, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 23 03:32:27 localhost systemd[1]: tmp-crun.qNAicI.mount: Deactivated successfully. 
Feb 23 03:32:27 localhost podman[89686]: 2026-02-23 08:32:27.026935344 +0000 UTC m=+0.198880682 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, container_name=metrics_qdr) Feb 23 03:32:27 localhost podman[89684]: 2026-02-23 08:32:27.032493358 +0000 UTC m=+0.210568078 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Feb 23 03:32:27 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:32:27 localhost podman[89685]: 2026-02-23 08:32:27.053746456 +0000 UTC m=+0.227289374 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5) Feb 23 03:32:27 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:32:27 localhost podman[89686]: 2026-02-23 08:32:27.217316758 +0000 UTC m=+0.389262066 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:32:27 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:32:32 localhost sshd[89759]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:32:40 localhost sshd[89784]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:32:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:32:48 localhost podman[89786]: 2026-02-23 08:32:48.922505075 +0000 UTC m=+0.086729951 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Feb 23 03:32:48 localhost podman[89786]: 2026-02-23 08:32:48.933261744 +0000 UTC m=+0.097486610 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team) Feb 23 03:32:48 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:32:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:32:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. 
Feb 23 03:32:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:32:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 03:32:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:32:52 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 23 03:32:52 localhost recover_tripleo_nova_virtqemud[89837]: 61982 
Feb 23 03:32:52 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 23 03:32:52 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:32:52 localhost systemd[1]: tmp-crun.PRM2Ir.mount: Deactivated successfully. 
Feb 23 03:32:52 localhost podman[89809]: 2026-02-23 08:32:52.930811154 +0000 UTC m=+0.097660726 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.buildah.version=1.41.5) Feb 23 03:32:52 localhost podman[89809]: 2026-02-23 08:32:52.946380934 +0000 UTC m=+0.113230486 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, version=17.1.13, name=rhosp-rhel9/openstack-cron, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 23 03:32:52 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:32:52 localhost podman[89806]: 2026-02-23 08:32:52.987337088 +0000 UTC m=+0.160252192 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, version=17.1.13, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, release=1766032510, container_name=iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:32:52 localhost podman[89815]: 2026-02-23 08:32:52.942303257 +0000 UTC m=+0.101555845 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:32:53 localhost podman[89806]: 2026-02-23 08:32:53.025212861 +0000 UTC m=+0.198127995 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13) Feb 23 03:32:53 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:32:53 localhost podman[89815]: 2026-02-23 08:32:53.075412374 +0000 UTC m=+0.234665032 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, release=1766032510, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:32:53 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:32:53 localhost podman[89807]: 2026-02-23 08:32:53.095590086 +0000 UTC m=+0.263932887 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4) Feb 23 03:32:53 localhost podman[89808]: 2026-02-23 08:32:53.026836944 +0000 UTC m=+0.194286695 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 03:32:53 localhost podman[89808]: 2026-02-23 08:32:53.158037547 +0000 UTC m=+0.325487308 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, 
name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, 
container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4) Feb 23 03:32:53 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:32:53 localhost podman[89807]: 2026-02-23 08:32:53.180271568 +0000 UTC m=+0.348614419 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13) Feb 23 03:32:53 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:32:55 localhost systemd[1]: tmp-crun.dezNwA.mount: Deactivated successfully. 
Feb 23 03:32:55 localhost podman[89925]: 2026-02-23 08:32:55.923774413 +0000 UTC m=+0.093497687 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, release=1766032510, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:32:56 localhost podman[89925]: 2026-02-23 08:32:56.357409606 +0000 UTC m=+0.527132850 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) Feb 23 03:32:56 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:32:57 localhost systemd[1]: tmp-crun.WWaX8D.mount: Deactivated successfully. Feb 23 03:32:57 localhost podman[89950]: 2026-02-23 08:32:57.921644059 +0000 UTC m=+0.087008401 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 23 03:32:57 localhost podman[89949]: 2026-02-23 08:32:57.980334745 +0000 UTC m=+0.148794291 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, tcib_managed=true) Feb 23 03:32:58 localhost podman[89949]: 2026-02-23 
08:32:58.035334477 +0000 UTC m=+0.203794053 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:32:58 localhost podman[89948]: 2026-02-23 08:32:58.078809287 +0000 UTC m=+0.249156455 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, tcib_managed=true, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git) Feb 23 03:32:58 localhost podman[89948]: 2026-02-23 08:32:58.104232534 +0000 UTC m=+0.274579692 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:32:58 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:32:58 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:32:58 localhost podman[89950]: 2026-02-23 08:32:58.128330117 +0000 UTC m=+0.293694379 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:32:58 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:32:58 localhost systemd[1]: tmp-crun.RwNjmR.mount: Deactivated successfully. Feb 23 03:33:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=46574 SEQ=0 ACK=574472157 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:33:02 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:9a:b6:c6 MACPROTO=0800 SRC=167.248.133.126 DST=38.102.83.164 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=0 PROTO=TCP SPT=25857 DPT=19885 SEQ=4029011176 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40303070402080A699C10BE0000000000) Feb 23 03:33:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:33:20 localhost systemd[1]: tmp-crun.JEmfzM.mount: Deactivated successfully. 
Feb 23 03:33:20 localhost podman[90102]: 2026-02-23 08:33:20.037486447 +0000 UTC m=+0.209583957 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, version=17.1.13, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:33:20 localhost podman[90102]: 2026-02-23 08:33:20.090294587 +0000 UTC m=+0.262392067 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, release=1766032510, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:33:20 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:33:21 localhost sshd[90122]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:33:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:33:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:33:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:33:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:33:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:33:24 localhost systemd[1]: tmp-crun.xALL3z.mount: Deactivated successfully. Feb 23 03:33:24 localhost podman[90124]: 2026-02-23 08:33:24.060797335 +0000 UTC m=+0.220678185 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:33:24 localhost podman[90125]: 2026-02-23 08:33:24.151998725 +0000 UTC m=+0.310090196 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com, 
managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc.) Feb 23 03:33:24 localhost podman[90127]: 2026-02-23 08:33:24.014047238 +0000 UTC m=+0.165321661 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) Feb 23 03:33:24 localhost podman[90125]: 2026-02-23 08:33:24.212492122 +0000 UTC m=+0.370583603 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, url=https://www.redhat.com, tcib_managed=true) Feb 23 03:33:24 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:33:24 localhost podman[90127]: 2026-02-23 08:33:24.300818895 +0000 UTC m=+0.452093338 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, 
distribution-scope=public, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:33:24 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:33:24 localhost podman[90126]: 2026-02-23 08:33:24.118443457 +0000 UTC m=+0.274142358 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:33:24 localhost podman[90126]: 2026-02-23 08:33:24.413288233 +0000 UTC m=+0.568987084 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red 
Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible) Feb 23 03:33:24 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:33:24 localhost podman[90128]: 2026-02-23 08:33:23.919069662 +0000 UTC m=+0.073171819 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5) Feb 23 03:33:24 localhost podman[90124]: 2026-02-23 08:33:24.526486337 +0000 UTC m=+0.686367217 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:33:24 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:33:24 localhost podman[90128]: 2026-02-23 08:33:24.586352721 +0000 UTC m=+0.740454918 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) Feb 23 03:33:24 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:33:26 localhost sshd[90236]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:33:26 localhost podman[90238]: 2026-02-23 08:33:26.910927926 +0000 UTC m=+0.080296998 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510) Feb 23 03:33:27 localhost podman[90238]: 2026-02-23 08:33:27.322558905 +0000 UTC m=+0.491927977 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Feb 23 03:33:27 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:33:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:33:28 localhost systemd[1]: tmp-crun.k5XMCe.mount: Deactivated successfully. Feb 23 03:33:28 localhost podman[90261]: 2026-02-23 08:33:28.92110285 +0000 UTC m=+0.097865773 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:33:28 localhost podman[90261]: 2026-02-23 08:33:28.946522007 +0000 UTC m=+0.123284900 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container) Feb 23 03:33:28 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:33:28 localhost podman[90263]: 2026-02-23 08:33:28.971464298 +0000 UTC m=+0.142085715 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible) Feb 23 03:33:29 localhost podman[90262]: 2026-02-23 08:33:29.01469017 +0000 UTC m=+0.186685724 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z) Feb 23 03:33:29 localhost podman[90262]: 2026-02-23 08:33:29.054236437 +0000 UTC m=+0.226232001 container exec_died 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team) Feb 23 03:33:29 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:33:29 localhost podman[90263]: 2026-02-23 08:33:29.175392615 +0000 UTC m=+0.346014092 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:33:29 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:33:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:33:50 localhost podman[90358]: 2026-02-23 08:33:50.907027475 +0000 UTC m=+0.076847513 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true) Feb 23 03:33:50 localhost podman[90358]: 2026-02-23 08:33:50.916623555 +0000 UTC m=+0.086443573 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:33:50 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:33:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:33:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:33:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:33:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:33:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:33:54 localhost systemd[1]: tmp-crun.lGuv9f.mount: Deactivated successfully. 
Feb 23 03:33:54 localhost podman[90381]: 2026-02-23 08:33:54.956860078 +0000 UTC m=+0.095027028 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4) Feb 23 03:33:54 localhost podman[90381]: 2026-02-23 08:33:54.982313306 +0000 UTC m=+0.120480246 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:33:54 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:33:55 localhost podman[90383]: 2026-02-23 08:33:55.038126167 +0000 UTC m=+0.197785613 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:33:55 localhost podman[90383]: 2026-02-23 08:33:55.051216883 +0000 UTC m=+0.210876329 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:33:55 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:33:55 localhost podman[90380]: 2026-02-23 08:33:54.938392063 +0000 UTC m=+0.108811497 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:33:55 localhost podman[90382]: 2026-02-23 08:33:55.097731333 +0000 UTC m=+0.261157015 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 23 03:33:55 localhost podman[90389]: 2026-02-23 08:33:55.150350887 +0000 UTC m=+0.309225487 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true) Feb 23 03:33:55 localhost podman[90382]: 2026-02-23 08:33:55.167350033 +0000 UTC m=+0.330775695 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:33:55 localhost podman[90380]: 2026-02-23 08:33:55.17474109 +0000 UTC m=+0.345160554 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:33:55 localhost systemd[1]: 
9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:33:55 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:33:55 localhost podman[90389]: 2026-02-23 08:33:55.207341727 +0000 UTC m=+0.366216307 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:33:55 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:33:57 localhost podman[90494]: 2026-02-23 08:33:57.903013808 +0000 UTC m=+0.081315821 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc.) Feb 23 03:33:58 localhost podman[90494]: 2026-02-23 08:33:58.371068767 +0000 UTC m=+0.549370750 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, release=1766032510) Feb 23 03:33:58 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:33:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:33:59 localhost podman[90520]: 2026-02-23 08:33:59.946653758 +0000 UTC m=+0.122842944 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, 
name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com) Feb 23 03:33:59 localhost podman[90520]: 2026-02-23 08:33:59.969186599 +0000 UTC m=+0.145375805 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5) Feb 23 03:33:59 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:34:00 localhost systemd[1]: tmp-crun.HB2RQl.mount: Deactivated successfully. Feb 23 03:34:00 localhost podman[90521]: 2026-02-23 08:34:00.061977032 +0000 UTC m=+0.235166299 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team) Feb 23 03:34:00 localhost podman[90521]: 2026-02-23 08:34:00.101667055 +0000 UTC m=+0.274856372 container exec_died 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 23 03:34:00 localhost podman[90522]: 2026-02-23 08:34:00.115211336 +0000 UTC m=+0.284127091 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr) Feb 23 03:34:00 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:34:00 localhost podman[90522]: 2026-02-23 08:34:00.31246108 +0000 UTC m=+0.481376815 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public) Feb 23 03:34:00 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:34:02 localhost sshd[90596]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:34:18 localhost sshd[90676]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:34:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:34:21 localhost podman[90678]: 2026-02-23 08:34:21.935009232 +0000 UTC m=+0.090936175 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, container_name=collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, distribution-scope=public) Feb 23 03:34:21 localhost podman[90678]: 2026-02-23 08:34:21.947072384 +0000 UTC m=+0.102999297 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-collectd, tcib_managed=true) Feb 23 03:34:21 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:34:22 localhost sshd[90698]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:34:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:34:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:34:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:34:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:34:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:34:25 localhost podman[90700]: 2026-02-23 08:34:25.916319044 +0000 UTC m=+0.089527373 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public) Feb 23 03:34:25 localhost podman[90708]: 2026-02-23 08:34:25.962348618 +0000 UTC m=+0.125982293 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, 
release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:34:25 localhost systemd[1]: tmp-crun.9pf08y.mount: Deactivated successfully. Feb 23 03:34:26 localhost podman[90702]: 2026-02-23 08:34:26.020942159 +0000 UTC m=+0.185404229 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, container_name=ceilometer_agent_ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:34:26 localhost podman[90709]: 2026-02-23 08:34:25.985250262 +0000 UTC m=+0.141793149 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step5, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:34:26 localhost podman[90708]: 2026-02-23 08:34:26.045381171 +0000 UTC m=+0.209014876 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-cron-container) Feb 23 03:34:26 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:34:26 localhost podman[90709]: 2026-02-23 08:34:26.07332089 +0000 UTC m=+0.229863737 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute) Feb 23 03:34:26 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:34:26 localhost podman[90700]: 2026-02-23 08:34:26.099445392 +0000 UTC m=+0.272653731 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.buildah.version=1.41.5, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Feb 23 03:34:26 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:34:26 localhost podman[90702]: 2026-02-23 08:34:26.129466575 +0000 UTC m=+0.293928645 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, config_id=tripleo_step4) Feb 23 03:34:26 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:34:26 localhost podman[90701]: 2026-02-23 08:34:26.20153707 +0000 UTC m=+0.366428643 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 23 03:34:26 localhost podman[90701]: 2026-02-23 08:34:26.238347722 +0000 UTC m=+0.403239275 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 23 03:34:26 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:34:28 localhost podman[90816]: 2026-02-23 08:34:28.9214352 +0000 UTC m=+0.095397063 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z) Feb 23 03:34:29 localhost podman[90816]: 2026-02-23 08:34:29.339520741 +0000 UTC m=+0.513482624 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_migration_target) Feb 23 03:34:29 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:34:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:34:30 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:34:30 localhost recover_tripleo_nova_virtqemud[90860]: 61982 Feb 23 03:34:30 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:34:30 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:34:30 localhost systemd[1]: tmp-crun.WrMliu.mount: Deactivated successfully. 
Feb 23 03:34:30 localhost podman[90843]: 2026-02-23 08:34:30.926027145 +0000 UTC m=+0.092969039 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1766032510) Feb 23 03:34:30 localhost systemd[1]: tmp-crun.9CkvkQ.mount: Deactivated successfully. Feb 23 03:34:30 localhost podman[90842]: 2026-02-23 08:34:30.985175212 +0000 UTC m=+0.153808398 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:34:31 localhost podman[90841]: 2026-02-23 08:34:31.031409134 +0000 UTC 
m=+0.203918919 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:34:31 localhost podman[90842]: 2026-02-23 08:34:31.035096727 +0000 UTC m=+0.203729933 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=) Feb 23 03:34:31 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:34:31 localhost podman[90841]: 2026-02-23 08:34:31.066219714 +0000 UTC m=+0.238729479 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vcs-type=git, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc.) Feb 23 03:34:31 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:34:31 localhost podman[90843]: 2026-02-23 08:34:31.130070006 +0000 UTC m=+0.297011810 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Feb 23 03:34:31 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:34:44 localhost sshd[90942]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:34:52 localhost systemd[1]: tmp-crun.FfSYdT.mount: Deactivated successfully. 
Feb 23 03:34:52 localhost podman[90944]: 2026-02-23 08:34:52.918576575 +0000 UTC m=+0.088085251 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, container_name=collectd) Feb 23 03:34:52 localhost podman[90944]: 2026-02-23 08:34:52.932184279 +0000 UTC m=+0.101692925 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 23 03:34:52 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:34:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:34:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:34:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:34:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:34:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:34:56 localhost systemd[1]: tmp-crun.dXElnZ.mount: Deactivated successfully. Feb 23 03:34:56 localhost podman[90969]: 2026-02-23 08:34:56.930104653 +0000 UTC m=+0.094893628 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13) Feb 23 03:34:56 localhost systemd[1]: tmp-crun.yn3lHb.mount: Deactivated successfully. 
Feb 23 03:34:56 localhost podman[90964]: 2026-02-23 08:34:56.962149786 +0000 UTC m=+0.134007177 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc.) Feb 23 03:34:56 localhost podman[90965]: 2026-02-23 08:34:56.938226672 +0000 UTC m=+0.102799520 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, version=17.1.13, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 23 03:34:57 localhost podman[90965]: 2026-02-23 08:34:57.02434187 +0000 UTC m=+0.188914738 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:34:57 localhost podman[90966]: 2026-02-23 08:34:56.982712352 +0000 UTC m=+0.144224602 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=) Feb 23 03:34:57 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:34:57 localhost podman[90973]: 2026-02-23 08:34:57.039962769 +0000 UTC m=+0.195610042 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:34:57 localhost podman[90969]: 2026-02-23 08:34:57.064547953 +0000 UTC m=+0.229336908 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4) Feb 23 03:34:57 localhost podman[90973]: 2026-02-23 08:34:57.06820122 +0000 UTC 
m=+0.223848513 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:34:57 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:34:57 localhost podman[90964]: 2026-02-23 08:34:57.093474626 +0000 UTC m=+0.265332017 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, container_name=iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 23 03:34:57 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:34:57 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:34:57 localhost podman[90966]: 2026-02-23 08:34:57.169368077 +0000 UTC m=+0.330880307 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1) Feb 23 03:34:57 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:34:59 localhost podman[91085]: 2026-02-23 08:34:59.905900877 +0000 UTC m=+0.082202815 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:35:00 localhost podman[91085]: 2026-02-23 08:35:00.289576798 +0000 UTC m=+0.465878726 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:35:00 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:35:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:35:01 localhost podman[91108]: 2026-02-23 08:35:01.917752565 +0000 UTC m=+0.096327215 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:35:01 localhost podman[91108]: 2026-02-23 08:35:01.96839479 +0000 UTC m=+0.146969440 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 03:35:01 localhost systemd[1]: tmp-crun.hGufWv.mount: Deactivated successfully. Feb 23 03:35:01 localhost podman[91109]: 2026-02-23 08:35:01.977587433 +0000 UTC m=+0.151230136 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:35:01 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:35:02 localhost podman[91109]: 2026-02-23 08:35:02.025788741 +0000 UTC m=+0.199431404 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, release=1766032510) Feb 23 03:35:02 localhost podman[91110]: 2026-02-23 08:35:02.035266054 +0000 UTC m=+0.207208252 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:35:02 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:35:02 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:9a:b6:c6 MACPROTO=0800 SRC=167.248.133.126 DST=38.102.83.164 LEN=60 TOS=0x00 PREC=0x00 TTL=50 ID=0 PROTO=TCP SPT=46677 DPT=19885 SEQ=3435006332 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40303070402080A699C11360000000000) Feb 23 03:35:02 localhost podman[91110]: 2026-02-23 08:35:02.250520482 +0000 UTC m=+0.422462610 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step1, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com) Feb 23 03:35:02 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:35:07 localhost sshd[91182]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:35:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:35:23 localhost podman[91261]: 2026-02-23 08:35:23.911055723 +0000 UTC m=+0.083465284 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red 
Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:35:23 localhost podman[91261]: 2026-02-23 08:35:23.952477444 +0000 UTC m=+0.124886995 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, release=1766032510, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:35:23 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:35:24 localhost sshd[91281]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:35:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:35:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:35:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:35:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:35:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:35:27 localhost systemd[1]: tmp-crun.XvzbRx.mount: Deactivated successfully. Feb 23 03:35:27 localhost podman[91285]: 2026-02-23 08:35:27.929269146 +0000 UTC m=+0.091617025 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:35:27 localhost podman[91285]: 2026-02-23 08:35:27.965223103 +0000 UTC m=+0.127570942 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git) Feb 23 03:35:27 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:35:27 localhost podman[91291]: 2026-02-23 08:35:27.987447442 +0000 UTC m=+0.145168883 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z) Feb 23 03:35:28 localhost podman[91291]: 2026-02-23 08:35:28.018266455 +0000 UTC m=+0.175987896 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, tcib_managed=true, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute) Feb 23 03:35:28 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:35:28 localhost podman[91284]: 2026-02-23 08:35:28.030539077 +0000 UTC m=+0.194211407 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:35:28 localhost podman[91283]: 2026-02-23 08:35:28.083470276 +0000 UTC m=+0.251865387 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step3) Feb 23 03:35:28 localhost podman[91284]: 2026-02-23 08:35:28.091357737 +0000 UTC m=+0.255030127 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git) Feb 23 03:35:28 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:35:28 localhost podman[91286]: 2026-02-23 08:35:28.143999527 +0000 UTC m=+0.301068617 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, distribution-scope=public) Feb 23 03:35:28 localhost podman[91286]: 2026-02-23 08:35:28.155177594 +0000 UTC m=+0.312246664 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, managed_by=tripleo_ansible) Feb 23 03:35:28 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:35:28 localhost podman[91283]: 2026-02-23 08:35:28.1751201 +0000 UTC m=+0.343515221 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, vcs-type=git, release=1766032510, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public) Feb 23 03:35:28 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:35:28 localhost systemd[1]: tmp-crun.OGlHXQ.mount: Deactivated successfully. Feb 23 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:35:30 localhost podman[91396]: 2026-02-23 08:35:30.897593642 +0000 UTC m=+0.075537131 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 23 03:35:31 localhost podman[91396]: 2026-02-23 08:35:31.253501307 +0000 UTC m=+0.431444806 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Feb 23 03:35:31 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:35:32 localhost podman[91420]: 2026-02-23 08:35:32.920139252 +0000 UTC m=+0.088354500 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true) Feb 23 03:35:32 localhost podman[91419]: 2026-02-23 08:35:32.903951895 +0000 UTC m=+0.077555615 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com) Feb 23 03:35:32 localhost podman[91420]: 2026-02-23 08:35:32.969198347 +0000 UTC m=+0.137413575 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:35:32 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:35:32 localhost podman[91419]: 2026-02-23 08:35:32.988264865 +0000 UTC m=+0.161868555 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public) Feb 23 03:35:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:35:33 localhost podman[91421]: 2026-02-23 08:35:32.972030067 +0000 UTC m=+0.136364232 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red 
Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, release=1766032510, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:35:33 localhost podman[91421]: 2026-02-23 08:35:33.15326069 +0000 UTC m=+0.317594905 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, container_name=metrics_qdr, release=1766032510, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Feb 23 03:35:33 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:35:33 localhost systemd[1]: tmp-crun.RmNRg3.mount: Deactivated successfully. Feb 23 03:35:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:35:47 localhost recover_tripleo_nova_virtqemud[91515]: 61982 Feb 23 03:35:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:35:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:35:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:35:54 localhost podman[91516]: 2026-02-23 08:35:54.923569455 +0000 UTC m=+0.094481236 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, container_name=collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git) Feb 23 03:35:54 localhost podman[91516]: 2026-02-23 08:35:54.93439811 +0000 UTC m=+0.105309891 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.13) Feb 23 03:35:54 localhost systemd[1]: 
186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:35:56 localhost sshd[91536]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:35:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:35:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:35:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:35:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:35:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:35:58 localhost systemd[1]: tmp-crun.lKJz9m.mount: Deactivated successfully. Feb 23 03:35:58 localhost podman[91539]: 2026-02-23 08:35:58.976106432 +0000 UTC m=+0.134667608 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:35:58 localhost podman[91538]: 2026-02-23 08:35:58.938938315 +0000 UTC m=+0.100176707 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:35:59 localhost podman[91540]: 2026-02-23 08:35:59.061828026 +0000 UTC m=+0.216710285 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:35:59 localhost podman[91540]: 2026-02-23 08:35:59.090425348 +0000 UTC m=+0.245307667 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1) Feb 23 03:35:59 localhost podman[91538]: 2026-02-23 08:35:59.090816301 +0000 UTC m=+0.252054693 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=iscsid, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:35:59 localhost podman[91543]: 2026-02-23 08:35:58.964950706 +0000 UTC m=+0.112983537 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:35:59 localhost podman[91543]: 2026-02-23 08:35:59.119731774 +0000 UTC m=+0.267764585 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Feb 23 03:35:59 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:35:59 localhost podman[91539]: 2026-02-23 08:35:59.175469502 +0000 UTC m=+0.334030728 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510) Feb 23 03:35:59 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:35:59 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:35:59 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:35:59 localhost podman[91552]: 2026-02-23 08:35:59.041101915 +0000 UTC m=+0.186242564 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:35:59 localhost podman[91552]: 2026-02-23 08:35:59.327504162 +0000 UTC m=+0.472644781 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:35:59 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:36:01 localhost systemd[1]: tmp-crun.3mUafh.mount: Deactivated successfully. Feb 23 03:36:01 localhost podman[91653]: 2026-02-23 08:36:01.928492808 +0000 UTC m=+0.098567306 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=nova_migration_target, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) 
Feb 23 03:36:02 localhost podman[91653]: 2026-02-23 08:36:02.310372442 +0000 UTC m=+0.480446950 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, vcs-type=git, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:36:02 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:36:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:36:03 localhost podman[91680]: 2026-02-23 08:36:03.922091274 +0000 UTC m=+0.091608114 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true) Feb 23 03:36:03 localhost systemd[1]: tmp-crun.bkVTTI.mount: Deactivated successfully. Feb 23 03:36:03 localhost podman[91679]: 2026-02-23 08:36:03.982945556 +0000 UTC m=+0.157721413 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public) Feb 23 03:36:04 localhost podman[91678]: 2026-02-23 08:36:04.029572164 +0000 UTC m=+0.205473908 container 
health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, maintainer=OpenStack TripleO 
Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510) Feb 23 03:36:04 localhost podman[91678]: 2026-02-23 08:36:04.056170182 +0000 UTC m=+0.232071896 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:36:04 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:36:04 localhost podman[91679]: 2026-02-23 08:36:04.081524051 +0000 UTC m=+0.256299948 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:36:04 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:36:04 localhost podman[91680]: 2026-02-23 08:36:04.129480361 +0000 UTC m=+0.298997241 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git) Feb 23 03:36:04 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:36:04 localhost sshd[91755]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:36:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:36:25 localhost systemd[1]: tmp-crun.PwhsjV.mount: Deactivated successfully. 
Feb 23 03:36:25 localhost podman[91835]: 2026-02-23 08:36:25.926086216 +0000 UTC m=+0.096081516 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:36:25 localhost podman[91835]: 2026-02-23 08:36:25.941363473 +0000 UTC m=+0.111358783 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, com.redhat.component=openstack-collectd-container) Feb 23 03:36:25 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:36:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:36:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:36:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:36:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:36:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:36:29 localhost podman[91856]: 2026-02-23 08:36:29.922253476 +0000 UTC m=+0.091255412 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.) 
Feb 23 03:36:29 localhost podman[91856]: 2026-02-23 08:36:29.935080995 +0000 UTC m=+0.104082901 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, version=17.1.13, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true) Feb 23 03:36:29 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:36:29 localhost podman[91865]: 2026-02-23 08:36:29.985444062 +0000 UTC m=+0.138099307 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:36:29 localhost podman[91864]: 2026-02-23 08:36:29.938091731 +0000 UTC m=+0.092152801 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:36:30 localhost podman[91865]: 2026-02-23 08:36:30.037195013 +0000 UTC m=+0.189850278 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, 
batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:36:30 localhost podman[91864]: 2026-02-23 08:36:30.071932752 +0000 UTC m=+0.225993812 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond) Feb 23 03:36:30 localhost podman[91857]: 2026-02-23 08:36:30.080506075 +0000 UTC m=+0.243805099 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64) Feb 23 03:36:30 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:36:30 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:36:30 localhost podman[91858]: 2026-02-23 08:36:30.053473512 +0000 UTC m=+0.212725738 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:36:30 localhost podman[91858]: 2026-02-23 08:36:30.132772132 +0000 UTC m=+0.292024388 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64) Feb 23 03:36:30 localhost podman[91857]: 2026-02-23 08:36:30.140308003 +0000 UTC m=+0.303607007 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:36:30 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:36:30 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:36:32 localhost podman[91970]: 2026-02-23 08:36:32.924944116 +0000 UTC m=+0.091268962 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:36:33 localhost podman[91970]: 2026-02-23 08:36:33.336473056 +0000 UTC m=+0.502797912 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com) Feb 23 03:36:33 localhost systemd[1]: 
0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:36:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:36:34 localhost systemd[1]: tmp-crun.3nluua.mount: Deactivated successfully. Feb 23 03:36:34 localhost podman[91993]: 2026-02-23 08:36:34.933162788 +0000 UTC m=+0.109020009 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, 
container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:36:34 localhost systemd[1]: tmp-crun.h1XlOs.mount: Deactivated successfully. Feb 23 03:36:34 localhost podman[91993]: 2026-02-23 08:36:34.982051049 +0000 UTC m=+0.157908260 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:36:34 localhost podman[91995]: 2026-02-23 08:36:34.992580454 +0000 UTC m=+0.158823238 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:36:34 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated 
successfully. Feb 23 03:36:35 localhost podman[91994]: 2026-02-23 08:36:35.031169095 +0000 UTC m=+0.202800391 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:36:35 localhost podman[91994]: 2026-02-23 08:36:35.108642327 +0000 UTC m=+0.280273603 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, release=1766032510) Feb 23 03:36:35 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:36:35 localhost podman[91995]: 2026-02-23 08:36:35.23534935 +0000 UTC m=+0.401592074 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:36:35 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:36:42 localhost sshd[92065]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:36:42 localhost sshd[92066]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:36:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:36:56 localhost podman[92069]: 2026-02-23 08:36:56.926712389 +0000 UTC m=+0.096744168 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:36:56 localhost podman[92069]: 2026-02-23 08:36:56.938302548 +0000 UTC m=+0.108334277 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:36:56 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:37:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:37:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:37:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:37:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:37:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:37:00 localhost systemd[1]: tmp-crun.v4zxiY.mount: Deactivated successfully. Feb 23 03:37:00 localhost podman[92088]: 2026-02-23 08:37:00.923692047 +0000 UTC m=+0.089908226 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, version=17.1.13, batch=17.1_20260112.1) Feb 23 03:37:00 localhost podman[92090]: 2026-02-23 08:37:00.993964243 +0000 UTC m=+0.154400343 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-cron-container) Feb 23 03:37:01 localhost podman[92090]: 2026-02-23 08:37:01.009291298 +0000 UTC m=+0.169727438 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:37:01 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:37:01 localhost podman[92089]: 2026-02-23 08:37:00.969767733 +0000 UTC m=+0.131538805 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, release=1766032510, vcs-type=git) Feb 23 03:37:01 localhost podman[92089]: 2026-02-23 08:37:01.049586206 +0000 UTC m=+0.211357308 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64) Feb 23 03:37:01 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:37:01 localhost podman[92093]: 2026-02-23 08:37:01.102616669 +0000 UTC m=+0.256169866 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=nova_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z) Feb 23 03:37:01 localhost podman[92088]: 2026-02-23 08:37:01.107108147 +0000 UTC m=+0.273324416 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:37:01 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:37:01 localhost podman[92087]: 2026-02-23 08:37:01.201183591 +0000 UTC m=+0.370090843 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=) Feb 23 03:37:01 localhost podman[92087]: 2026-02-23 08:37:01.213358218 +0000 UTC m=+0.382265460 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, version=17.1.13, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Feb 23 03:37:01 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:37:01 localhost podman[92093]: 2026-02-23 08:37:01.227194917 +0000 UTC m=+0.380748124 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510) Feb 23 03:37:01 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:37:03 localhost podman[92199]: 2026-02-23 08:37:03.906690479 +0000 UTC m=+0.080228416 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:37:04 localhost podman[92199]: 2026-02-23 08:37:04.328858354 +0000 UTC m=+0.502396241 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.buildah.version=1.41.5) Feb 23 03:37:04 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:37:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:37:05 localhost podman[92222]: 2026-02-23 08:37:05.911382119 +0000 UTC m=+0.085677954 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 
17.1 ovn-controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:37:05 localhost systemd[1]: tmp-crun.JgkojT.mount: Deactivated successfully. Feb 23 03:37:05 localhost podman[92223]: 2026-02-23 08:37:05.976528386 +0000 UTC m=+0.144757093 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 23 03:37:06 localhost podman[92224]: 2026-02-23 08:37:06.022598414 +0000 UTC m=+0.189395208 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, container_name=metrics_qdr, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, release=1766032510, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Feb 23 03:37:06 localhost podman[92222]: 2026-02-23 08:37:06.044378219 +0000 UTC m=+0.218674044 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true) Feb 23 03:37:06 localhost podman[92223]: 2026-02-23 08:37:06.053440309 +0000 UTC m=+0.221668976 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:37:06 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:37:06 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:37:06 localhost podman[92224]: 2026-02-23 08:37:06.318432086 +0000 UTC m=+0.485228880 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:37:06 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:37:21 localhost sshd[92300]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:37:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:37:27 localhost systemd[1]: tmp-crun.D0wkt6.mount: Deactivated successfully. 
Feb 23 03:37:27 localhost podman[92379]: 2026-02-23 08:37:27.647226039 +0000 UTC m=+0.093835687 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 23 03:37:27 localhost podman[92379]: 2026-02-23 08:37:27.660916253 +0000 UTC m=+0.107525931 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:37:27 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:37:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:37:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:37:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:37:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:37:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:37:31 localhost systemd[1]: tmp-crun.befOUE.mount: Deactivated successfully. Feb 23 03:37:31 localhost podman[92401]: 2026-02-23 08:37:31.933317871 +0000 UTC m=+0.101002799 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team) Feb 23 03:37:31 localhost podman[92401]: 2026-02-23 08:37:31.967253582 +0000 UTC m=+0.134938510 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, url=https://www.redhat.com, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:37:31 localhost podman[92400]: 2026-02-23 08:37:31.982255008 +0000 UTC m=+0.149493902 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, distribution-scope=public) Feb 23 03:37:31 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:37:32 localhost podman[92402]: 2026-02-23 08:37:32.025056573 +0000 UTC m=+0.188581892 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:37:32 localhost podman[92400]: 2026-02-23 08:37:32.04855409 +0000 UTC m=+0.215792994 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:37:32 localhost podman[92403]: 2026-02-23 08:37:32.084823534 +0000 UTC m=+0.245721102 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team) Feb 23 03:37:32 localhost podman[92402]: 2026-02-23 08:37:32.112677707 +0000 UTC m=+0.276203006 
container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:37:32 localhost podman[92403]: 2026-02-23 08:37:32.121308234 +0000 UTC m=+0.282205782 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, container_name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container) Feb 23 03:37:32 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:37:32 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:37:32 localhost podman[92404]: 2026-02-23 08:37:32.134526454 +0000 UTC m=+0.292230932 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=) Feb 23 03:37:32 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:37:32 localhost podman[92404]: 2026-02-23 08:37:32.169404003 +0000 UTC m=+0.327108491 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true) Feb 23 03:37:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:37:32 localhost systemd[1]: tmp-crun.DOzmyK.mount: Deactivated successfully. Feb 23 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:37:34 localhost systemd[1]: tmp-crun.D3LYGv.mount: Deactivated successfully. 
Feb 23 03:37:34 localhost podman[92516]: 2026-02-23 08:37:34.920602415 +0000 UTC m=+0.093777115 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:37:35 localhost podman[92516]: 2026-02-23 08:37:35.333403611 +0000 UTC m=+0.506578381 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Feb 23 03:37:35 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:37:35 localhost sshd[92538]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:37:36 localhost systemd[1]: tmp-crun.St56Ni.mount: Deactivated successfully. 
Feb 23 03:37:36 localhost podman[92540]: 2026-02-23 08:37:36.422401451 +0000 UTC m=+0.085236762 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4) Feb 23 03:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:37:36 localhost podman[92541]: 2026-02-23 08:37:36.443738841 +0000 UTC m=+0.100132242 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, container_name=ovn_metadata_agent, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z) Feb 23 03:37:36 localhost podman[92540]: 2026-02-23 08:37:36.453199545 +0000 UTC m=+0.116034806 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.openshift.expose-services=, version=17.1.13, container_name=ovn_controller, distribution-scope=public, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:37:36 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:37:36 localhost podman[92541]: 2026-02-23 08:37:36.495223976 +0000 UTC m=+0.151617367 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:37:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:37:36 localhost podman[92573]: 2026-02-23 08:37:36.519420585 +0000 UTC m=+0.074602911 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 23 03:37:36 localhost podman[92573]: 2026-02-23 08:37:36.741628718 +0000 UTC m=+0.296811024 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, 
url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:37:36 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:37:47 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:37:47 localhost recover_tripleo_nova_virtqemud[92618]: 61982 Feb 23 03:37:47 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:37:47 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:37:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:37:57 localhost podman[92619]: 2026-02-23 08:37:57.907682338 +0000 UTC m=+0.084040364 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, version=17.1.13, container_name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:37:57 localhost podman[92619]: 2026-02-23 08:37:57.948371768 +0000 UTC m=+0.124729764 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, version=17.1.13, 
container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:37:57 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:38:01 localhost sshd[92640]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:38:02 localhost systemd[1]: tmp-crun.Dt9uLk.mount: Deactivated successfully. Feb 23 03:38:02 localhost podman[92642]: 2026-02-23 08:38:02.924706708 +0000 UTC m=+0.096015464 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, container_name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z) Feb 23 03:38:02 localhost podman[92642]: 2026-02-23 08:38:02.96318153 +0000 UTC m=+0.134490286 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team) Feb 23 03:38:02 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:38:02 localhost podman[92643]: 2026-02-23 08:38:02.987771861 +0000 UTC m=+0.154151975 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:38:03 localhost podman[92650]: 2026-02-23 08:38:03.031127724 +0000 UTC m=+0.191002256 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:38:03 localhost podman[92643]: 2026-02-23 08:38:03.046381916 +0000 UTC m=+0.212762040 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 23 03:38:03 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:38:03 localhost podman[92650]: 2026-02-23 08:38:03.070462893 +0000 UTC m=+0.230337415 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5) Feb 23 03:38:03 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:38:03 localhost podman[92644]: 2026-02-23 08:38:03.136666854 +0000 UTC m=+0.299671143 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:38:03 localhost podman[92644]: 2026-02-23 08:38:03.195389842 +0000 UTC m=+0.358394121 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:38:03 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:38:03 localhost podman[92645]: 2026-02-23 08:38:03.195934989 +0000 UTC m=+0.354959365 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, release=1766032510) Feb 23 03:38:03 localhost podman[92645]: 2026-02-23 08:38:03.283121979 +0000 UTC m=+0.442146345 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=) Feb 23 03:38:03 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:38:03 localhost systemd[1]: tmp-crun.x44AjN.mount: Deactivated successfully. Feb 23 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:38:05 localhost podman[92758]: 2026-02-23 08:38:05.912066774 +0000 UTC m=+0.083527367 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:38:06 localhost podman[92758]: 2026-02-23 08:38:06.259350251 +0000 UTC m=+0.430810854 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack 
TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target) Feb 23 03:38:06 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:38:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:38:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:38:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:38:06 localhost systemd[1]: tmp-crun.nsjypA.mount: Deactivated successfully. Feb 23 03:38:06 localhost podman[92783]: 2026-02-23 08:38:06.958985971 +0000 UTC m=+0.132007190 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, vcs-type=git) Feb 23 03:38:07 localhost podman[92784]: 2026-02-23 08:38:07.004358176 +0000 UTC m=+0.172522964 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step1, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:38:07 localhost podman[92782]: 2026-02-23 08:38:07.046827311 +0000 UTC m=+0.223289246 container 
health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, release=1766032510, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:38:07 localhost podman[92783]: 2026-02-23 08:38:07.070037511 +0000 UTC m=+0.243058720 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Feb 23 03:38:07 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:38:07 localhost podman[92782]: 2026-02-23 08:38:07.103399814 +0000 UTC m=+0.279861729 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, container_name=ovn_controller, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git) Feb 23 03:38:07 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:38:07 localhost podman[92784]: 2026-02-23 08:38:07.241260104 +0000 UTC m=+0.409424862 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:38:07 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:38:21 localhost sshd[92858]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:38:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:38:28 localhost podman[92938]: 2026-02-23 08:38:28.920648255 +0000 UTC m=+0.090247837 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:38:28 localhost podman[92938]: 2026-02-23 08:38:28.937104314 +0000 UTC m=+0.106703876 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, architecture=x86_64) Feb 23 03:38:28 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:38:33 localhost systemd[1]: tmp-crun.i1mqyw.mount: Deactivated successfully. Feb 23 03:38:33 localhost podman[93006]: 2026-02-23 08:38:33.925029725 +0000 UTC m=+0.099333327 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 23 03:38:33 localhost podman[93006]: 2026-02-23 08:38:33.969535523 +0000 UTC m=+0.143839085 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, 
architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, container_name=iscsid, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:38:33 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:38:34 localhost podman[93009]: 2026-02-23 08:38:33.968926415 +0000 UTC m=+0.135091316 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, version=17.1.13, container_name=logrotate_crond, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.) Feb 23 03:38:34 localhost podman[93008]: 2026-02-23 08:38:34.016905671 +0000 UTC m=+0.184948030 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:38:34 localhost podman[93007]: 2026-02-23 08:38:34.085097952 +0000 UTC m=+0.254494473 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5) Feb 23 03:38:34 localhost podman[93014]: 2026-02-23 08:38:34.130831599 +0000 UTC m=+0.293111189 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:38:34 localhost podman[93007]: 2026-02-23 08:38:34.1405597 +0000 UTC m=+0.309956171 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vendor=Red Hat, Inc.) Feb 23 03:38:34 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:38:34 localhost podman[93008]: 2026-02-23 08:38:34.155328028 +0000 UTC m=+0.323370367 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:38:34 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:38:34 localhost podman[93014]: 2026-02-23 08:38:34.18900244 +0000 UTC m=+0.351281940 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, config_id=tripleo_step5, container_name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible) Feb 23 03:38:34 localhost podman[93009]: 2026-02-23 08:38:34.210649261 +0000 UTC m=+0.376814192 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible) Feb 23 03:38:34 localhost systemd[1]: 
c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:38:34 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:38:36 localhost podman[93119]: 2026-02-23 08:38:36.897581613 +0000 UTC m=+0.072035203 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible) Feb 23 03:38:37 localhost podman[93119]: 2026-02-23 08:38:37.261570246 +0000 UTC m=+0.436023826 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:38:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:38:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:38:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:38:37 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:38:37 localhost systemd[1]: tmp-crun.LrRK2s.mount: Deactivated successfully. Feb 23 03:38:37 localhost podman[93140]: 2026-02-23 08:38:37.372369979 +0000 UTC m=+0.082590190 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510) Feb 23 03:38:37 localhost podman[93142]: 2026-02-23 08:38:37.494989016 +0000 UTC m=+0.206399363 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 23 03:38:37 localhost podman[93140]: 2026-02-23 08:38:37.50060525 +0000 UTC m=+0.210825521 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4) Feb 23 03:38:37 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:38:37 localhost podman[93141]: 2026-02-23 08:38:37.464630315 +0000 UTC m=+0.175993941 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:38:37 localhost podman[93141]: 2026-02-23 08:38:37.545972876 +0000 UTC m=+0.257336492 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 03:38:37 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:38:37 localhost podman[93142]: 2026-02-23 08:38:37.718441197 +0000 UTC m=+0.429851604 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:38:37 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:38:37 localhost systemd[1]: tmp-crun.PyzsBH.mount: Deactivated successfully. Feb 23 03:38:42 localhost sshd[93218]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:38:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:38:59 localhost podman[93220]: 2026-02-23 08:38:59.906377551 +0000 UTC m=+0.073228095 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git) Feb 23 03:38:59 localhost podman[93220]: 2026-02-23 08:38:59.920201946 +0000 UTC m=+0.087052430 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5) Feb 23 03:38:59 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:39:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:39:04 localhost systemd[1]: tmp-crun.fe7jZL.mount: Deactivated successfully. 
Feb 23 03:39:04 localhost podman[93247]: 2026-02-23 08:39:04.951405104 +0000 UTC m=+0.113463100 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond, tcib_managed=true, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 23 03:39:04 localhost podman[93241]: 2026-02-23 08:39:04.96902828 +0000 UTC m=+0.138666435 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:39:04 localhost podman[93240]: 2026-02-23 08:39:04.981913494 +0000 UTC m=+0.156655970 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, architecture=x86_64, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:39:04 localhost podman[93240]: 2026-02-23 08:39:04.98906316 +0000 UTC m=+0.163805666 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5) Feb 23 03:39:04 localhost podman[93241]: 2026-02-23 08:39:04.999201128 +0000 UTC m=+0.168839303 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4) Feb 23 03:39:05 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:39:05 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:39:05 localhost podman[93247]: 2026-02-23 08:39:05.087209898 +0000 UTC m=+0.249267844 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:39:05 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:39:05 localhost podman[93251]: 2026-02-23 08:39:05.138424589 +0000 UTC m=+0.298209143 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:39:05 localhost podman[93242]: 2026-02-23 08:39:05.179257164 +0000 UTC m=+0.346179883 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5) Feb 23 03:39:05 localhost podman[93251]: 2026-02-23 08:39:05.199299775 +0000 UTC m=+0.359084319 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 23 03:39:05 localhost podman[93242]: 2026-02-23 08:39:05.209595678 +0000 UTC m=+0.376518397 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:39:05 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:39:05 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:39:07 localhost systemd[1]: tmp-crun.UtyNHQ.mount: Deactivated successfully. 
Feb 23 03:39:07 localhost podman[93357]: 2026-02-23 08:39:07.932539221 +0000 UTC m=+0.106218932 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 23 03:39:08 localhost podman[93360]: 2026-02-23 08:39:08.030029758 +0000 UTC m=+0.195709608 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 23 03:39:08 localhost podman[93359]: 2026-02-23 08:39:07.9913142 +0000 UTC m=+0.158254810 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, 
io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:39:08 localhost podman[93359]: 2026-02-23 08:39:08.080361222 +0000 UTC m=+0.247301802 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13) Feb 23 03:39:08 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:39:08 localhost podman[93358]: 2026-02-23 08:39:08.132924066 +0000 UTC m=+0.303246332 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:39:08 localhost podman[93358]: 2026-02-23 08:39:08.159180752 +0000 UTC m=+0.329503008 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, distribution-scope=public) Feb 23 03:39:08 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:39:08 localhost podman[93360]: 2026-02-23 08:39:08.230229348 +0000 UTC m=+0.395909208 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Feb 23 03:39:08 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:39:08 localhost podman[93357]: 2026-02-23 08:39:08.343371827 +0000 UTC m=+0.517051528 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public) Feb 23 03:39:08 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:39:08 localhost systemd[1]: tmp-crun.dxyc8V.mount: Deactivated successfully. Feb 23 03:39:09 localhost sshd[93452]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=45778 SEQ=0 ACK=2193559953 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:39:25 localhost sshd[93455]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:25 localhost sshd[93457]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:26 localhost sshd[93459]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:27 localhost sshd[93461]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:29 localhost sshd[93463]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:39:30 localhost systemd[1]: tmp-crun.hw13LJ.mount: Deactivated successfully. 
Feb 23 03:39:30 localhost podman[93465]: 2026-02-23 08:39:30.932077708 +0000 UTC m=+0.102041992 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:39:30 localhost podman[93465]: 2026-02-23 08:39:30.947308307 +0000 UTC m=+0.117272671 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, release=1766032510, batch=17.1_20260112.1, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:39:30 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:39:31 localhost sshd[93486]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:32 localhost sshd[93488]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:34 localhost sshd[93568]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:34 localhost podman[93626]: Feb 23 03:39:34 localhost podman[93626]: 2026-02-23 08:39:34.873923742 +0000 UTC m=+0.080487053 container create 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, RELEASE=main, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_BRANCH=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347) Feb 23 03:39:34 localhost systemd[1]: Started libpod-conmon-14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3.scope. Feb 23 03:39:34 localhost systemd[1]: Started libcrun container. 
Feb 23 03:39:34 localhost podman[93626]: 2026-02-23 08:39:34.840884243 +0000 UTC m=+0.047447614 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 03:39:34 localhost podman[93626]: 2026-02-23 08:39:34.951979298 +0000 UTC m=+0.158542619 container init 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 03:39:34 localhost podman[93626]: 2026-02-23 08:39:34.969399256 +0000 UTC m=+0.175962567 container start 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 
on RHEL 9, description=Red Hat Ceph Storage 7, ceph=True, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 03:39:34 localhost podman[93626]: 2026-02-23 08:39:34.969783928 +0000 UTC m=+0.176347259 container attach 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, io.buildah.version=1.42.2, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, name=rhceph, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red 
Hat Ceph Storage 7, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, CEPH_POINT_RELEASE=) Feb 23 03:39:34 localhost quirky_lumiere[93641]: 167 167 Feb 23 03:39:34 localhost systemd[1]: libpod-14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3.scope: Deactivated successfully. Feb 23 03:39:34 localhost podman[93626]: 2026-02-23 08:39:34.974995282 +0000 UTC m=+0.181558603 container died 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2) Feb 23 03:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck 
run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:39:35 localhost podman[93646]: 2026-02-23 08:39:35.084167088 +0000 UTC m=+0.097395566 container remove 14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_lumiere, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, distribution-scope=public, com.redhat.component=rhceph-container, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:39:35 localhost systemd[1]: libpod-conmon-14fae1fcc3ca35c5316b2a7fece2747cb26fe4492594ba0a5075b7988fad06d3.scope: Deactivated successfully. 
Feb 23 03:39:35 localhost podman[93660]: 2026-02-23 08:39:35.174227451 +0000 UTC m=+0.102572439 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:39:35 localhost podman[93660]: 2026-02-23 08:39:35.23554063 +0000 UTC m=+0.163885538 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, release=1766032510, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, tcib_managed=true) Feb 23 03:39:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:39:35 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:39:35 localhost podman[93716]: Feb 23 03:39:35 localhost podman[93706]: 2026-02-23 08:39:35.327944178 +0000 UTC m=+0.089930831 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=) Feb 23 03:39:35 localhost podman[93716]: 2026-02-23 08:39:35.338722316 +0000 UTC m=+0.087274477 container create d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, 
io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, vcs-type=git) Feb 23 03:39:35 localhost systemd[1]: Started libpod-conmon-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope. Feb 23 03:39:35 localhost podman[93716]: 2026-02-23 08:39:35.305693207 +0000 UTC m=+0.054245408 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 03:39:35 localhost systemd[1]: Started libcrun container. 
Feb 23 03:39:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f/merged/rootfs supports timestamps until 2038 (0x7fffffff) Feb 23 03:39:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 03:39:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 03:39:35 localhost podman[93716]: 2026-02-23 08:39:35.424819535 +0000 UTC m=+0.173371676 container init d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, release=1770267347, build-date=2026-02-09T10:25:24Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7) Feb 23 03:39:35 localhost podman[93733]: 2026-02-23 08:39:35.427309993 +0000 UTC m=+0.153037596 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z) Feb 23 03:39:35 localhost podman[93716]: 2026-02-23 08:39:35.437120103 +0000 UTC m=+0.185672244 container start d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, 
Inc., io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64) Feb 23 03:39:35 localhost podman[93716]: 2026-02-23 08:39:35.437973119 +0000 UTC m=+0.186525300 container attach d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, distribution-scope=public) Feb 23 03:39:35 localhost podman[93659]: 2026-02-23 08:39:35.237081828 +0000 UTC m=+0.166263082 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, release=1766032510, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z) Feb 23 03:39:35 localhost podman[93706]: 2026-02-23 08:39:35.443975798 +0000 UTC m=+0.205962461 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, url=https://www.redhat.com) Feb 23 03:39:35 localhost podman[93733]: 2026-02-23 08:39:35.456448351 +0000 UTC m=+0.182175954 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:39:35 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:39:35 localhost podman[93659]: 2026-02-23 08:39:35.478407861 +0000 UTC m=+0.407589125 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com) Feb 23 03:39:35 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:39:35 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:39:35 localhost podman[93703]: 2026-02-23 08:39:35.396662869 +0000 UTC m=+0.163009880 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5) Feb 23 03:39:35 localhost podman[93703]: 2026-02-23 08:39:35.582414363 +0000 UTC m=+0.348761324 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, release=1766032510, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:39:35 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:39:35 localhost sshd[93806]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:35 localhost systemd[1]: var-lib-containers-storage-overlay-387aafea2448999ae2405fb00529bd6bf8cf6a12bf38716a9e8a06259075d983-merged.mount: Deactivated successfully. 
Feb 23 03:39:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=58188 SEQ=0 ACK=2144845811 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:39:36 localhost romantic_mendel[93779]: [ Feb 23 03:39:36 localhost romantic_mendel[93779]: { Feb 23 03:39:36 localhost romantic_mendel[93779]: "available": false, Feb 23 03:39:36 localhost romantic_mendel[93779]: "ceph_device": false, Feb 23 03:39:36 localhost romantic_mendel[93779]: "device_id": "QEMU_DVD-ROM_QM00001", Feb 23 03:39:36 localhost romantic_mendel[93779]: "lsm_data": {}, Feb 23 03:39:36 localhost romantic_mendel[93779]: "lvs": [], Feb 23 03:39:36 localhost romantic_mendel[93779]: "path": "/dev/sr0", Feb 23 03:39:36 localhost romantic_mendel[93779]: "rejected_reasons": [ Feb 23 03:39:36 localhost romantic_mendel[93779]: "Insufficient space (<5GB)", Feb 23 03:39:36 localhost romantic_mendel[93779]: "Has a FileSystem" Feb 23 03:39:36 localhost romantic_mendel[93779]: ], Feb 23 03:39:36 localhost romantic_mendel[93779]: "sys_api": { Feb 23 03:39:36 localhost romantic_mendel[93779]: "actuators": null, Feb 23 03:39:36 localhost romantic_mendel[93779]: "device_nodes": "sr0", Feb 23 03:39:36 localhost romantic_mendel[93779]: "human_readable_size": "482.00 KB", Feb 23 03:39:36 localhost romantic_mendel[93779]: "id_bus": "ata", Feb 23 03:39:36 localhost romantic_mendel[93779]: "model": "QEMU DVD-ROM", Feb 23 03:39:36 localhost romantic_mendel[93779]: "nr_requests": "2", Feb 23 03:39:36 localhost romantic_mendel[93779]: "partitions": {}, Feb 23 03:39:36 localhost romantic_mendel[93779]: "path": "/dev/sr0", Feb 23 03:39:36 localhost romantic_mendel[93779]: "removable": "1", Feb 23 03:39:36 localhost romantic_mendel[93779]: "rev": "2.5+", Feb 23 03:39:36 localhost romantic_mendel[93779]: "ro": "0", Feb 23 03:39:36 localhost romantic_mendel[93779]: "rotational": "1", Feb 23 
03:39:36 localhost romantic_mendel[93779]: "sas_address": "", Feb 23 03:39:36 localhost romantic_mendel[93779]: "sas_device_handle": "", Feb 23 03:39:36 localhost romantic_mendel[93779]: "scheduler_mode": "mq-deadline", Feb 23 03:39:36 localhost romantic_mendel[93779]: "sectors": 0, Feb 23 03:39:36 localhost romantic_mendel[93779]: "sectorsize": "2048", Feb 23 03:39:36 localhost romantic_mendel[93779]: "size": 493568.0, Feb 23 03:39:36 localhost romantic_mendel[93779]: "support_discard": "0", Feb 23 03:39:36 localhost romantic_mendel[93779]: "type": "disk", Feb 23 03:39:36 localhost romantic_mendel[93779]: "vendor": "QEMU" Feb 23 03:39:36 localhost romantic_mendel[93779]: } Feb 23 03:39:36 localhost romantic_mendel[93779]: } Feb 23 03:39:36 localhost romantic_mendel[93779]: ] Feb 23 03:39:36 localhost systemd[1]: libpod-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope: Deactivated successfully. Feb 23 03:39:36 localhost systemd[1]: libpod-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope: Consumed 1.122s CPU time. 
Feb 23 03:39:36 localhost podman[95659]: 2026-02-23 08:39:36.620857877 +0000 UTC m=+0.056203230 container died d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, ceph=True, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, CEPH_POINT_RELEASE=, release=1770267347, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 03:39:36 localhost systemd[1]: tmp-crun.q7i8qj.mount: Deactivated successfully. Feb 23 03:39:36 localhost systemd[1]: var-lib-containers-storage-overlay-6b724e3f641c64179c2edf4f868097ca71a3efcf7a4817c804ae548c6ae76a7f-merged.mount: Deactivated successfully. 
Feb 23 03:39:36 localhost podman[95659]: 2026-02-23 08:39:36.665169281 +0000 UTC m=+0.100514594 container remove d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_mendel, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , RELEASE=main, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 03:39:36 localhost systemd[1]: libpod-conmon-d28267e237da207d15f8d7d0a908151398f2c09403693babac245c71652efb05.scope: Deactivated successfully. Feb 23 03:39:37 localhost sshd[95674]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:39:38 localhost systemd[1]: tmp-crun.obe6u4.mount: Deactivated successfully. Feb 23 03:39:38 localhost podman[95676]: 2026-02-23 08:39:38.476996798 +0000 UTC m=+0.150478846 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 23 03:39:38 localhost systemd[1]: tmp-crun.VsUxHm.mount: Deactivated successfully. Feb 23 03:39:38 localhost podman[95676]: 2026-02-23 08:39:38.539220825 +0000 UTC m=+0.212702853 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:39:38 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. 
Feb 23 03:39:38 localhost podman[95677]: 2026-02-23 08:39:38.523351496 +0000 UTC m=+0.196246326 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Feb 23 03:39:38 localhost sshd[95774]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:38 localhost podman[95678]: 2026-02-23 08:39:38.440544311 +0000 UTC m=+0.112016416 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, vcs-type=git) Feb 23 03:39:38 localhost 
podman[95717]: 2026-02-23 08:39:38.541077254 +0000 UTC m=+0.094250747 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true) Feb 23 03:39:38 localhost podman[95677]: 2026-02-23 08:39:38.603783477 +0000 UTC m=+0.276678307 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc.) Feb 23 03:39:38 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:39:38 localhost podman[95678]: 2026-02-23 08:39:38.702320337 +0000 UTC m=+0.373792372 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, 
distribution-scope=public, container_name=metrics_qdr, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:39:38 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:39:38 localhost podman[95717]: 2026-02-23 08:39:38.954985397 +0000 UTC m=+0.508158940 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, 
container_name=nova_migration_target, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:39:38 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:39:40 localhost sshd[95792]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:41 localhost sshd[95794]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:42 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 23 03:39:42 localhost recover_tripleo_nova_virtqemud[95798]: 61982 Feb 23 03:39:42 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:39:42 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:39:43 localhost sshd[95799]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:44 localhost sshd[95801]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:46 localhost sshd[95803]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:47 localhost sshd[95805]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:49 localhost sshd[95807]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:51 localhost sshd[95809]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:52 localhost sshd[95811]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:54 localhost sshd[95813]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:56 localhost sshd[95815]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:57 localhost sshd[95817]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:58 localhost sshd[95819]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:39:59 localhost sshd[95821]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:00 localhost sshd[95822]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:40:01 localhost systemd[1]: tmp-crun.chWVWC.mount: Deactivated successfully. 
Feb 23 03:40:01 localhost podman[95825]: 2026-02-23 08:40:01.287724095 +0000 UTC m=+0.092225633 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:40:01 localhost podman[95825]: 2026-02-23 08:40:01.303288925 +0000 UTC m=+0.107790513 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:40:01 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:40:02 localhost sshd[95846]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:04 localhost sshd[95848]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:40:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:40:05 localhost systemd[1]: tmp-crun.7hkJgD.mount: Deactivated successfully. Feb 23 03:40:05 localhost podman[95850]: 2026-02-23 08:40:05.831713384 +0000 UTC m=+0.102107243 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:40:05 localhost systemd[1]: tmp-crun.kRWCa3.mount: Deactivated successfully. 
Feb 23 03:40:05 localhost podman[95850]: 2026-02-23 08:40:05.869318078 +0000 UTC m=+0.139711957 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:40:05 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:40:05 localhost podman[95852]: 2026-02-23 08:40:05.927331204 +0000 UTC m=+0.191310471 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:40:05 localhost podman[95851]: 2026-02-23 08:40:05.893807049 +0000 UTC m=+0.162749063 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., 
batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, 
release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git) Feb 23 03:40:05 localhost podman[95857]: 2026-02-23 08:40:05.846900163 +0000 UTC m=+0.103402685 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Feb 23 03:40:05 localhost sshd[95958]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:05 localhost podman[95851]: 2026-02-23 08:40:05.978305158 +0000 UTC m=+0.247247132 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, 
url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13) Feb 23 03:40:05 localhost podman[95857]: 2026-02-23 08:40:05.98631401 +0000 UTC m=+0.242816562 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute) Feb 23 03:40:05 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:40:06 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:40:06 localhost podman[95853]: 2026-02-23 08:40:06.039844924 +0000 UTC m=+0.301372654 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond) Feb 23 03:40:06 localhost podman[95852]: 2026-02-23 08:40:06.048190236 +0000 UTC m=+0.312169523 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public) Feb 23 03:40:06 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:40:06 localhost podman[95853]: 2026-02-23 08:40:06.073282266 +0000 UTC m=+0.334810026 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.component=openstack-cron-container, 
container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:40:06 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:40:07 localhost sshd[95973]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:07 localhost sshd[95975]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:08 localhost sshd[95977]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:40:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:40:08 localhost systemd[1]: tmp-crun.Z2edxv.mount: Deactivated successfully. 
Feb 23 03:40:08 localhost podman[95980]: 2026-02-23 08:40:08.929625716 +0000 UTC m=+0.096053793 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, url=https://www.redhat.com, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z) Feb 23 03:40:08 localhost podman[95980]: 2026-02-23 08:40:08.991486162 +0000 UTC m=+0.157914279 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z) Feb 23 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:40:09 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. Feb 23 03:40:09 localhost podman[95978]: 2026-02-23 08:40:09.008045503 +0000 UTC m=+0.176635678 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git) Feb 23 03:40:09 localhost podman[95978]: 2026-02-23 08:40:09.032002498 +0000 UTC m=+0.200592683 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true) Feb 23 03:40:09 localhost podman[95978]: unhealthy Feb 23 03:40:09 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:40:09 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:40:09 localhost podman[95981]: 2026-02-23 08:40:09.083051394 +0000 UTC m=+0.245993861 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:40:09 localhost podman[96032]: 2026-02-23 08:40:09.162194464 +0000 UTC m=+0.144713434 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com) Feb 23 03:40:09 localhost podman[95981]: 2026-02-23 08:40:09.262411327 +0000 UTC m=+0.425353594 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, 
build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:40:09 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:40:09 localhost podman[96032]: 2026-02-23 08:40:09.550256714 +0000 UTC m=+0.532775724 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:40:09 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:40:09 localhost systemd[1]: tmp-crun.U8p10R.mount: Deactivated successfully. 
Feb 23 03:40:10 localhost sshd[96082]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:12 localhost sshd[96084]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:13 localhost sshd[96086]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:15 localhost sshd[96088]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:16 localhost sshd[96090]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:18 localhost sshd[96092]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:20 localhost sshd[96094]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:22 localhost sshd[96096]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:24 localhost sshd[96098]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:25 localhost sshd[96100]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:27 localhost sshd[96102]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:28 localhost sshd[96104]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:29 localhost sshd[96106]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:40:31 localhost podman[96108]: 2026-02-23 08:40:31.573298746 +0000 UTC m=+0.086187643 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:40:31 localhost podman[96108]: 2026-02-23 08:40:31.583774216 +0000 UTC m=+0.096663123 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:10:15Z, version=17.1.13, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 23 03:40:31 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:40:31 localhost sshd[96129]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:33 localhost sshd[96131]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:34 localhost sshd[96133]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.
Feb 23 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.
Feb 23 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.
Feb 23 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.
Feb 23 03:40:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.
Feb 23 03:40:36 localhost podman[96136]: 2026-02-23 08:40:36.433889357 +0000 UTC m=+0.093355108 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Feb 23 03:40:36 localhost podman[96138]: 2026-02-23 08:40:36.497248181 +0000 UTC m=+0.148896736 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 
(image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z) Feb 23 03:40:36 localhost podman[96136]: 2026-02-23 08:40:36.524342853 +0000 UTC m=+0.183808674 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:40:36 localhost podman[96135]: 2026-02-23 08:40:36.533477441 +0000 UTC m=+0.195154352 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git) Feb 23 03:40:36 localhost systemd[1]: 
68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:40:36 localhost sshd[96233]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:36 localhost podman[96135]: 2026-02-23 08:40:36.550401593 +0000 UTC m=+0.212078494 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3) Feb 23 03:40:36 localhost podman[96144]: 2026-02-23 08:40:36.459791973 +0000 UTC m=+0.106033358 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., 
batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, container_name=nova_compute) Feb 23 03:40:36 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:40:36 localhost podman[96138]: 2026-02-23 08:40:36.580740498 +0000 UTC m=+0.232389093 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:40:36 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:40:36 localhost podman[96144]: 2026-02-23 08:40:36.594555193 +0000 UTC m=+0.240796618 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, distribution-scope=public, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, version=17.1.13, 
build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:40:36 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:40:36 localhost podman[96137]: 2026-02-23 08:40:36.65106936 +0000 UTC m=+0.306436084 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi) Feb 23 03:40:36 localhost podman[96137]: 2026-02-23 08:40:36.711339786 +0000 UTC m=+0.366706479 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:40:36 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:40:37 localhost sshd[96255]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:40:39 localhost podman[96257]: 2026-02-23 08:40:39.298084875 +0000 UTC m=+0.084464368 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, vcs-type=git, 
container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:40:39 localhost podman[96258]: 2026-02-23 08:40:39.354797859 +0000 UTC m=+0.137738374 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public) Feb 23 03:40:39 localhost podman[96257]: 2026-02-23 08:40:39.360188419 +0000 UTC m=+0.146567872 container exec_died 
1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Feb 23 03:40:39 localhost podman[96257]: unhealthy Feb 23 03:40:39 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:40:39 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:40:39 localhost podman[96258]: 2026-02-23 08:40:39.431281556 +0000 UTC m=+0.214222081 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:40:39 localhost sshd[96320]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:39 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:40:39 localhost podman[96288]: 2026-02-23 08:40:39.470692045 +0000 UTC m=+0.155478522 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
build-date=2026-01-12T22:10:14Z, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1) Feb 23 03:40:39 localhost podman[96288]: 2026-02-23 08:40:39.672357081 +0000 UTC m=+0.357143588 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Feb 23 03:40:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:40:39 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:40:39 localhost podman[96342]: 2026-02-23 08:40:39.76609181 +0000 UTC m=+0.068077443 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vcs-type=git, 
build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5) Feb 23 03:40:40 localhost podman[96342]: 2026-02-23 08:40:40.204276827 +0000 UTC m=+0.506262460 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 23 03:40:40 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:40:40 localhost systemd[1]: tmp-crun.qwGcsr.mount: Deactivated successfully. 
Feb 23 03:40:40 localhost sshd[96457]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:42 localhost sshd[96491]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:42 localhost sshd[96493]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:44 localhost sshd[96495]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:45 localhost sshd[96497]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:46 localhost sshd[96499]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:48 localhost sshd[96501]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:48 localhost sshd[96502]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:50 localhost sshd[96505]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:51 localhost sshd[96507]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:53 localhost sshd[96509]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:54 localhost sshd[96511]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:56 localhost sshd[96513]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:58 localhost sshd[96515]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:40:59 localhost sshd[96517]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:00 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:41:00 localhost recover_tripleo_nova_virtqemud[96520]: 61982 Feb 23 03:41:00 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:41:00 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:41:01 localhost sshd[96521]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:41:01 localhost systemd[1]: tmp-crun.79YQ1d.mount: Deactivated successfully. 
Feb 23 03:41:01 localhost podman[96523]: 2026-02-23 08:41:01.919446843 +0000 UTC m=+0.087725702 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible) Feb 23 03:41:01 localhost podman[96523]: 2026-02-23 08:41:01.930828181 +0000 UTC m=+0.099107020 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, container_name=collectd, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:41:01 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:41:02 localhost sshd[96544]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:04 localhost sshd[96546]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:05 localhost sshd[96548]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:41:06 localhost podman[96551]: 2026-02-23 08:41:06.935893764 +0000 UTC m=+0.104079493 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute) Feb 23 03:41:06 localhost systemd[1]: tmp-crun.mEx6L6.mount: Deactivated successfully. 
Feb 23 03:41:06 localhost podman[96551]: 2026-02-23 08:41:06.983446952 +0000 UTC m=+0.151632711 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:41:06 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:41:07 localhost podman[96550]: 2026-02-23 08:41:07.038644714 +0000 UTC m=+0.206812973 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, io.openshift.expose-services=, container_name=iscsid, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container) Feb 23 03:41:07 localhost podman[96553]: 2026-02-23 08:41:06.985410075 +0000 UTC m=+0.144642559 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 23 03:41:07 localhost podman[96552]: 2026-02-23 08:41:07.086999838 +0000 UTC m=+0.249342171 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com) Feb 23 03:41:07 localhost podman[96550]: 2026-02-23 08:41:07.105079436 +0000 UTC m=+0.273247705 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:41:07 localhost podman[96552]: 2026-02-23 08:41:07.116313034 +0000 UTC m=+0.278655317 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, 
url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public) Feb 23 03:41:07 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:41:07 localhost podman[96553]: 2026-02-23 08:41:07.132368367 +0000 UTC m=+0.291600841 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=logrotate_crond, distribution-scope=public, release=1766032510) Feb 23 03:41:07 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:41:07 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:41:07 localhost podman[96564]: 2026-02-23 08:41:07.192766205 +0000 UTC m=+0.348723473 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step5, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute) Feb 23 03:41:07 localhost podman[96564]: 2026-02-23 08:41:07.253436322 +0000 UTC m=+0.409393570 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:32:04Z) Feb 23 03:41:07 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:41:07 localhost sshd[96671]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:07 localhost systemd[1]: tmp-crun.E3fNVO.mount: Deactivated successfully. Feb 23 03:41:09 localhost sshd[96673]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:41:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:41:09 localhost systemd[1]: tmp-crun.VsHCGf.mount: Deactivated successfully. 
Feb 23 03:41:09 localhost podman[96676]: 2026-02-23 08:41:09.930028689 +0000 UTC m=+0.100454118 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, architecture=x86_64, container_name=ovn_metadata_agent) Feb 23 03:41:09 localhost podman[96675]: 2026-02-23 08:41:09.976260186 +0000 UTC m=+0.146832770 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 23 03:41:10 localhost podman[96675]: 2026-02-23 08:41:10.016914393 +0000 UTC m=+0.187486977 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, 
build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true) Feb 23 03:41:10 localhost podman[96675]: unhealthy Feb 23 03:41:10 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:41:10 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:41:10 localhost podman[96677]: 2026-02-23 08:41:10.033831243 +0000 UTC m=+0.201437232 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:41:10 localhost podman[96676]: 2026-02-23 08:41:10.052162049 +0000 UTC m=+0.222587448 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Feb 23 03:41:10 localhost podman[96676]: unhealthy Feb 23 03:41:10 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 
03:41:10 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:41:10 localhost podman[96677]: 2026-02-23 08:41:10.240895193 +0000 UTC m=+0.408501172 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:10:14Z) Feb 23 03:41:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:41:10 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:41:10 localhost podman[96742]: 2026-02-23 08:41:10.35198156 +0000 UTC m=+0.082411593 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:41:10 localhost sshd[96765]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:10 localhost podman[96742]: 2026-02-23 08:41:10.747644291 +0000 UTC m=+0.478074324 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13) Feb 23 03:41:10 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. 
Feb 23 03:41:12 localhost sshd[96767]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:14 localhost sshd[96769]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:18 localhost sshd[96771]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:20 localhost sshd[96773]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:21 localhost sshd[96775]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:23 localhost sshd[96777]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:24 localhost sshd[96779]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:26 localhost sshd[96781]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:27 localhost sshd[96783]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:27 localhost sshd[96785]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:28 localhost sshd[96787]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:29 localhost sshd[96789]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:30 localhost sshd[96791]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:41:32 localhost systemd[1]: tmp-crun.Q8VtMk.mount: Deactivated successfully. 
Feb 23 03:41:32 localhost podman[96793]: 2026-02-23 08:41:32.190828963 +0000 UTC m=+0.097144312 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, container_name=collectd) Feb 23 03:41:32 localhost podman[96793]: 2026-02-23 08:41:32.202226537 +0000 UTC m=+0.108541906 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:41:32 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:41:32 localhost sshd[96815]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:34 localhost sshd[96817]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:36 localhost sshd[96819]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:41:37 localhost podman[96821]: 2026-02-23 08:41:37.656177988 +0000 UTC m=+0.093575289 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, version=17.1.13, container_name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git) Feb 23 03:41:37 localhost systemd[1]: tmp-crun.jNbHDe.mount: Deactivated successfully. 
Feb 23 03:41:37 localhost podman[96821]: 2026-02-23 08:41:37.700405059 +0000 UTC m=+0.137802390 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:41:37 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:41:37 localhost podman[96822]: 2026-02-23 08:41:37.718397944 +0000 UTC m=+0.152875622 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T23:07:47Z) Feb 23 03:41:37 localhost podman[96830]: 2026-02-23 08:41:37.680193024 +0000 UTC m=+0.102467752 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510) Feb 23 03:41:37 localhost podman[96830]: 2026-02-23 08:41:37.759089493 +0000 UTC m=+0.181364241 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:41:37 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:41:37 localhost podman[96824]: 2026-02-23 08:41:37.777386267 +0000 UTC m=+0.203539949 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, version=17.1.13) Feb 23 03:41:37 localhost podman[96822]: 2026-02-23 08:41:37.781429566 +0000 UTC m=+0.215907214 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:41:37 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:41:37 localhost podman[96824]: 2026-02-23 08:41:37.817282081 +0000 UTC m=+0.243435813 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:41:37 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:41:37 localhost podman[96823]: 2026-02-23 08:41:37.871112069 +0000 UTC m=+0.301838787 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) 
Feb 23 03:41:37 localhost podman[96823]: 2026-02-23 08:41:37.904313539 +0000 UTC m=+0.335040217 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi) Feb 23 03:41:37 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:41:38 localhost sshd[96934]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:40 localhost sshd[96936]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:41:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:41:40 localhost podman[96940]: 2026-02-23 08:41:40.914968671 +0000 UTC m=+0.080427219 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team) Feb 23 03:41:40 localhost podman[96939]: 2026-02-23 08:41:40.985257004 +0000 UTC m=+0.153478960 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:41:41 localhost podman[96939]: 2026-02-23 08:41:41.038078891 +0000 UTC m=+0.206300827 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=ovn_controller, version=17.1.13, batch=17.1_20260112.1) Feb 23 03:41:41 localhost podman[96939]: unhealthy Feb 23 03:41:41 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:41:41 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:41:41 localhost podman[96944]: 2026-02-23 08:41:41.040483388 +0000 UTC m=+0.199192591 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:41:41 localhost podman[96938]: 2026-02-23 08:41:41.107472885 +0000 UTC m=+0.279726990 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:41:41 localhost podman[96940]: 2026-02-23 08:41:41.161352936 +0000 UTC m=+0.326811514 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 23 03:41:41 localhost podman[96940]: unhealthy Feb 23 03:41:41 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:41:41 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:41:41 localhost podman[96944]: 2026-02-23 08:41:41.269537599 +0000 UTC m=+0.428246812 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 23 03:41:41 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:41:41 localhost podman[96938]: 2026-02-23 08:41:41.518331052 +0000 UTC m=+0.690585177 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-nova-compute-container, 
config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:41:41 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:41:41 localhost sshd[97027]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:43 localhost sshd[97090]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:44 localhost sshd[97107]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:46 localhost sshd[97109]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:47 localhost sshd[97111]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:49 localhost sshd[97113]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:50 localhost sshd[97115]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:52 localhost sshd[97117]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:54 localhost sshd[97119]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:55 localhost sshd[97121]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:57 localhost sshd[97123]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:58 localhost sshd[97125]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:41:59 localhost sshd[97127]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:01 localhost sshd[97129]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:02 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:42:02 localhost podman[97131]: 2026-02-23 08:42:02.351822452 +0000 UTC m=+0.095460979 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13) Feb 23 03:42:02 localhost podman[97131]: 2026-02-23 08:42:02.363145203 +0000 UTC m=+0.106783750 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 23 03:42:02 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated 
successfully. Feb 23 03:42:02 localhost sshd[97151]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:04 localhost sshd[97153]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:05 localhost sshd[97155]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:07 localhost sshd[97157]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:07 localhost sshd[97159]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:42:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 03:42:07 localhost podman[97162]: 2026-02-23 08:42:07.979389437 +0000 UTC m=+0.146778238 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true) Feb 23 03:42:07 localhost podman[97161]: 2026-02-23 08:42:07.93719709 +0000 UTC m=+0.110361765 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:34:43Z, container_name=iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:42:08 localhost podman[97162]: 2026-02-23 08:42:08.035485218 +0000 UTC m=+0.202873969 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:42:08 localhost podman[97161]: 2026-02-23 08:42:08.070007919 +0000 UTC m=+0.243172544 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3) Feb 23 03:42:08 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:42:08 localhost podman[97198]: 2026-02-23 08:42:08.046691825 +0000 UTC m=+0.100470088 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 23 03:42:08 localhost podman[97198]: 2026-02-23 08:42:08.127118143 +0000 UTC m=+0.180896386 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, config_id=tripleo_step4, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1) Feb 23 03:42:08 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:42:08 localhost podman[97199]: 2026-02-23 08:42:08.143131594 +0000 UTC m=+0.192572508 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 23 03:42:08 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:42:08 localhost podman[97163]: 2026-02-23 08:42:08.078183011 +0000 UTC m=+0.243434983 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack 
Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, managed_by=tripleo_ansible) Feb 23 03:42:08 localhost podman[97199]: 2026-02-23 08:42:08.201977822 +0000 UTC m=+0.251418716 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, version=17.1.13, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:42:08 localhost podman[97163]: 2026-02-23 08:42:08.209565345 +0000 UTC m=+0.374817267 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z) Feb 23 03:42:08 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:42:08 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:42:08 localhost sshd[97270]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:10 localhost sshd[97272]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:42:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:42:11 localhost podman[97274]: 2026-02-23 08:42:11.720132334 +0000 UTC m=+0.091046317 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z) Feb 23 03:42:11 localhost podman[97275]: 2026-02-23 08:42:11.781831464 +0000 UTC m=+0.146589651 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Feb 23 03:42:11 localhost podman[97276]: 2026-02-23 08:42:11.855301959 +0000 UTC m=+0.219535469 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, release=1766032510, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 03:42:11 localhost podman[97276]: 2026-02-23 08:42:11.893947103 +0000 UTC m=+0.258180613 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64) Feb 23 03:42:11 localhost podman[97276]: unhealthy Feb 23 03:42:11 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:42:11 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:42:11 localhost podman[97280]: 2026-02-23 08:42:11.945953314 +0000 UTC m=+0.304627137 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:42:11 localhost podman[97275]: 2026-02-23 08:42:11.964818735 +0000 UTC m=+0.329576852 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, 
version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:42:11 localhost podman[97275]: unhealthy Feb 23 03:42:11 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:42:11 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:42:12 localhost podman[97274]: 2026-02-23 08:42:12.088199065 +0000 UTC m=+0.459113058 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:42:12 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:42:12 localhost podman[97280]: 2026-02-23 08:42:12.134999458 +0000 UTC m=+0.493673191 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, version=17.1.13, 
com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc.) Feb 23 03:42:12 localhost sshd[97366]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:12 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:42:12 localhost systemd[1]: tmp-crun.jSyDl0.mount: Deactivated successfully. 
Feb 23 03:42:13 localhost sshd[97368]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:13 localhost sshd[97370]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:15 localhost sshd[97372]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:16 localhost sshd[97374]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:18 localhost sshd[97376]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:20 localhost sshd[97378]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:22 localhost sshd[97380]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:24 localhost sshd[97382]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:28 localhost sshd[97384]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:42:32 localhost systemd[1]: tmp-crun.dH191E.mount: Deactivated successfully. Feb 23 03:42:32 localhost podman[97386]: 2026-02-23 08:42:32.925602078 +0000 UTC m=+0.097557775 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 
'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 23 03:42:32 localhost podman[97386]: 2026-02-23 08:42:32.936503997 +0000 UTC m=+0.108459654 container exec_died 
186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, release=1766032510, version=17.1.13, build-date=2026-01-12T22:10:15Z, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5) Feb 23 03:42:32 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:42:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:42:37 localhost recover_tripleo_nova_virtqemud[97407]: 61982 Feb 23 03:42:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:42:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:42:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:42:38 localhost systemd[1]: tmp-crun.Pn5lmX.mount: Deactivated successfully. Feb 23 03:42:38 localhost podman[97408]: 2026-02-23 08:42:38.978034264 +0000 UTC m=+0.149217394 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible) Feb 23 03:42:38 localhost podman[97410]: 2026-02-23 08:42:38.983419106 +0000 UTC m=+0.149903306 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:42:38 localhost podman[97408]: 2026-02-23 08:42:38.987996073 +0000 UTC m=+0.159179213 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-iscsid-container, container_name=iscsid, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:42:38 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:42:39 localhost podman[97409]: 2026-02-23 08:42:39.031542482 +0000 UTC m=+0.198110585 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:47Z, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, version=17.1.13) Feb 23 03:42:39 localhost podman[97411]: 2026-02-23 08:42:38.990302046 +0000 UTC m=+0.150187516 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, 
vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container) Feb 23 03:42:39 localhost podman[97409]: 2026-02-23 08:42:39.056432787 +0000 UTC m=+0.223000860 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, 
tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, batch=17.1_20260112.1) Feb 23 03:42:39 localhost podman[97410]: 2026-02-23 08:42:39.057099118 +0000 UTC m=+0.223583328 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:42:39 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:42:39 localhost podman[97417]: 2026-02-23 08:42:39.073240114 +0000 UTC m=+0.232516894 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container) Feb 23 03:42:39 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:42:39 localhost podman[97411]: 2026-02-23 08:42:39.12072977 +0000 UTC m=+0.280615270 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, 
managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:42:39 localhost podman[97417]: 2026-02-23 08:42:39.131329398 +0000 UTC m=+0.290606188 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step5, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:42:39 localhost 
systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:42:39 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:42:39 localhost systemd[1]: tmp-crun.nAg2iZ.mount: Deactivated successfully. Feb 23 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:42:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:42:42 localhost systemd[1]: tmp-crun.NbVNQc.mount: Deactivated successfully. Feb 23 03:42:42 localhost podman[97527]: 2026-02-23 08:42:42.988743714 +0000 UTC m=+0.147389047 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible) Feb 23 03:42:43 localhost podman[97524]: 2026-02-23 08:42:43.026572071 +0000 UTC m=+0.192493176 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, 
distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git) Feb 23 03:42:43 localhost podman[97525]: 2026-02-23 08:42:42.941995371 +0000 UTC m=+0.105875581 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:42:43 localhost podman[97526]: 2026-02-23 08:42:43.077097884 +0000 UTC m=+0.237946477 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, io.openshift.expose-services=) Feb 23 03:42:43 localhost podman[97526]: 2026-02-23 08:42:43.09577189 +0000 UTC m=+0.256620533 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64) Feb 23 03:42:43 localhost podman[97526]: unhealthy Feb 23 03:42:43 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:42:43 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:42:43 localhost podman[97525]: 2026-02-23 08:42:43.132285126 +0000 UTC m=+0.296165296 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git) Feb 23 03:42:43 localhost podman[97525]: unhealthy Feb 23 03:42:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:42:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:42:43 localhost podman[97527]: 2026-02-23 08:42:43.195283407 +0000 UTC m=+0.353928730 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:42:43 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:42:43 localhost podman[97524]: 2026-02-23 08:42:43.441491157 +0000 UTC m=+0.607412272 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git) Feb 23 03:42:43 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:42:43 localhost systemd[1]: tmp-crun.wEwyWT.mount: Deactivated successfully. Feb 23 03:42:44 localhost systemd[1]: tmp-crun.kziQbG.mount: Deactivated successfully. Feb 23 03:42:44 localhost podman[97722]: 2026-02-23 08:42:44.920478902 +0000 UTC m=+0.110163548 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, 
build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True) Feb 23 03:42:45 localhost podman[97722]: 2026-02-23 08:42:45.034191142 +0000 UTC m=+0.223875818 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, version=7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main) Feb 23 03:42:45 localhost sshd[97821]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:42:54 localhost sshd[97870]: 
main: sshd: ssh-rsa algorithm is disabled Feb 23 03:43:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:43:03 localhost systemd[1]: tmp-crun.eTyoD0.mount: Deactivated successfully. Feb 23 03:43:03 localhost podman[97872]: 2026-02-23 08:43:03.941672655 +0000 UTC m=+0.106621334 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z) Feb 23 03:43:03 localhost podman[97872]: 2026-02-23 08:43:03.958390529 +0000 UTC m=+0.123339198 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, 
io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3) Feb 23 03:43:03 
localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:43:09 localhost podman[97893]: 2026-02-23 08:43:09.930996439 +0000 UTC m=+0.097821564 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510) Feb 23 03:43:09 localhost podman[97893]: 2026-02-23 08:43:09.987228224 +0000 UTC m=+0.154053369 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
build-date=2026-01-12T23:07:47Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, 
managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git) Feb 23 03:43:10 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:43:10 localhost podman[97892]: 2026-02-23 08:43:09.989546528 +0000 UTC m=+0.160252087 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.buildah.version=1.41.5, 
url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:43:10 localhost podman[97894]: 2026-02-23 08:43:10.053053246 +0000 UTC m=+0.216963038 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:43:10 localhost podman[97892]: 2026-02-23 08:43:10.082487745 +0000 UTC m=+0.253193334 container exec_died 
40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, release=1766032510, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13) Feb 23 03:43:10 localhost podman[97900]: 2026-02-23 08:43:10.093735864 +0000 UTC m=+0.249354582 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true) Feb 23 03:43:10 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:43:10 localhost podman[97900]: 2026-02-23 08:43:10.102316018 +0000 UTC m=+0.257934696 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond) Feb 23 03:43:10 localhost podman[97894]: 2026-02-23 08:43:10.110099676 +0000 UTC m=+0.274009458 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Feb 23 03:43:10 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:43:10 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:43:10 localhost podman[97906]: 2026-02-23 08:43:10.193956713 +0000 UTC m=+0.348539907 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5) Feb 23 03:43:10 localhost podman[97906]: 2026-02-23 08:43:10.24864449 +0000 UTC m=+0.403227714 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1) Feb 23 03:43:10 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:43:10 localhost systemd[1]: tmp-crun.zw2yvM.mount: Deactivated successfully. Feb 23 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:43:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:43:13 localhost podman[98012]: 2026-02-23 08:43:13.923994173 +0000 UTC m=+0.091520894 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ovn_controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:43:13 localhost podman[98012]: 2026-02-23 08:43:13.939451644 +0000 UTC m=+0.106978395 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1) Feb 23 03:43:13 localhost podman[98012]: unhealthy Feb 23 03:43:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:43:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:43:14 localhost systemd[1]: tmp-crun.Yxb39x.mount: Deactivated successfully. Feb 23 03:43:14 localhost podman[98013]: 2026-02-23 08:43:14.032270566 +0000 UTC m=+0.198808496 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, architecture=x86_64, 
container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git) Feb 23 03:43:14 localhost podman[98013]: 2026-02-23 08:43:14.077403411 +0000 UTC m=+0.243941391 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public) Feb 23 03:43:14 localhost podman[98013]: unhealthy Feb 23 03:43:14 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:43:14 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:43:14 localhost podman[98014]: 2026-02-23 08:43:14.099930769 +0000 UTC m=+0.258838725 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, 
architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=metrics_qdr, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:43:14 localhost podman[98011]: 2026-02-23 08:43:14.134724045 +0000 UTC m=+0.305136178 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4) Feb 23 03:43:14 localhost podman[98014]: 2026-02-23 08:43:14.309327877 +0000 UTC m=+0.468235773 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, release=1766032510, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 23 03:43:14 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:43:14 localhost podman[98011]: 2026-02-23 08:43:14.503385289 +0000 UTC m=+0.673797372 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 03:43:14 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:43:14 localhost systemd[1]: tmp-crun.CCby9S.mount: Deactivated successfully. Feb 23 03:43:23 localhost sshd[98104]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:43:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:43:34 localhost systemd[1]: tmp-crun.GEDYu1.mount: Deactivated successfully. 
Feb 23 03:43:34 localhost podman[98106]: 2026-02-23 08:43:34.94295472 +0000 UTC m=+0.104173503 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible) Feb 23 03:43:34 localhost podman[98106]: 2026-02-23 08:43:34.95144578 +0000 UTC m=+0.112664573 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:43:34 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:43:40 localhost sshd[98127]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:43:40 localhost systemd[1]: tmp-crun.ZCXlBN.mount: Deactivated successfully. Feb 23 03:43:40 localhost podman[98129]: 2026-02-23 08:43:40.913626306 +0000 UTC m=+0.088761534 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:43:40 localhost podman[98132]: 2026-02-23 08:43:40.924998137 +0000 UTC m=+0.090242090 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron) Feb 23 03:43:40 localhost podman[98130]: 2026-02-23 08:43:40.964846674 +0000 UTC m=+0.136162030 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 23 03:43:40 localhost podman[98138]: 2026-02-23 08:43:40.976579738 +0000 UTC m=+0.138581048 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Feb 23 03:43:41 localhost podman[98138]: 2026-02-23 08:43:41.005293911 +0000 UTC m=+0.167295251 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:43:41 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:43:41 localhost podman[98130]: 2026-02-23 08:43:41.021664312 +0000 UTC m=+0.192979688 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:43:41 localhost podman[98131]: 2026-02-23 08:43:41.021942241 +0000 UTC m=+0.191518753 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 23 03:43:41 localhost podman[98132]: 2026-02-23 08:43:41.040268463 +0000 UTC m=+0.205512506 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron) Feb 23 03:43:41 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:43:41 localhost podman[98129]: 2026-02-23 08:43:41.057368867 +0000 UTC m=+0.232504055 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=) Feb 23 03:43:41 localhost podman[98131]: 2026-02-23 08:43:41.072279971 +0000 UTC m=+0.241856463 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=) Feb 23 03:43:41 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:43:41 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:43:41 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:43:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:43:44 localhost systemd[1]: tmp-crun.8foDq2.mount: Deactivated successfully. Feb 23 03:43:44 localhost podman[98244]: 2026-02-23 08:43:44.926222962 +0000 UTC m=+0.102068367 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:43:44 localhost podman[98246]: 2026-02-23 08:43:44.970034735 +0000 UTC m=+0.138408722 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, io.openshift.expose-services=) Feb 23 03:43:45 localhost podman[98246]: 2026-02-23 08:43:45.013440436 +0000 UTC m=+0.181814443 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:43:45 localhost podman[98246]: unhealthy Feb 23 03:43:45 localhost podman[98245]: 2026-02-23 08:43:45.020591883 +0000 UTC m=+0.191237702 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com) Feb 23 03:43:45 localhost systemd[1]: 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:43:45 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:43:45 localhost podman[98245]: 2026-02-23 08:43:45.06324959 +0000 UTC m=+0.233895419 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': 
'/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:43:45 localhost podman[98245]: unhealthy Feb 23 03:43:45 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:43:45 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:43:45 localhost podman[98248]: 2026-02-23 08:43:45.083131652 +0000 UTC m=+0.247770050 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:43:45 localhost podman[98248]: 2026-02-23 08:43:45.273799385 +0000 UTC m=+0.438437783 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, 
batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:43:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:43:45 localhost podman[98244]: 2026-02-23 08:43:45.317212076 +0000 UTC m=+0.493057471 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target) Feb 23 03:43:45 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:44:01 localhost sshd[98409]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:44:01 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:44:01 localhost recover_tripleo_nova_virtqemud[98412]: 61982 Feb 23 03:44:01 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:44:01 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:44:05 localhost systemd[1]: tmp-crun.QEqpTa.mount: Deactivated successfully. 
Feb 23 03:44:05 localhost podman[98413]: 2026-02-23 08:44:05.931608898 +0000 UTC m=+0.102392868 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, 
io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Feb 23 03:44:05 localhost podman[98413]: 2026-02-23 08:44:05.946232183 +0000 UTC m=+0.117016183 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, container_name=collectd, release=1766032510, config_id=tripleo_step3, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:44:05 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:44:12 localhost podman[98435]: 2026-02-23 08:44:11.951244872 +0000 UTC m=+0.116592388 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.13) Feb 23 03:44:12 localhost podman[98434]: 2026-02-23 08:44:12.003069351 +0000 UTC m=+0.170806743 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, 
batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5) Feb 23 03:44:12 localhost podman[98441]: 2026-02-23 08:44:12.012258913 +0000 UTC m=+0.163146959 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_compute, vcs-type=git, version=17.1.13) Feb 23 03:44:12 localhost podman[98434]: 2026-02-23 08:44:12.043748384 +0000 UTC m=+0.211485726 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public) Feb 23 03:44:12 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:44:12 localhost podman[98436]: 2026-02-23 08:44:11.912407338 +0000 UTC m=+0.078580231 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:44:12 localhost podman[98435]: 2026-02-23 08:44:12.087363412 +0000 UTC m=+0.252710938 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vcs-type=git, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=) Feb 23 03:44:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:44:12 localhost podman[98436]: 2026-02-23 08:44:12.101190211 +0000 UTC m=+0.267363124 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible) Feb 23 03:44:12 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:44:12 localhost podman[98437]: 2026-02-23 08:44:12.151304165 +0000 UTC m=+0.307793729 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, 
distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:44:12 localhost podman[98441]: 2026-02-23 08:44:12.169660268 +0000 UTC m=+0.320548284 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Feb 23 03:44:12 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:44:12 localhost podman[98437]: 2026-02-23 08:44:12.18730341 +0000 UTC m=+0.343793024 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron) Feb 23 03:44:12 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:44:12 localhost systemd[1]: tmp-crun.cngTkO.mount: Deactivated successfully. Feb 23 03:44:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:44:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:44:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:44:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:44:15 localhost podman[98552]: 2026-02-23 08:44:15.930485719 +0000 UTC m=+0.098633357 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:44:15 localhost podman[98553]: 2026-02-23 08:44:15.905924548 +0000 UTC m=+0.074450609 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, release=1766032510) Feb 23 03:44:15 localhost podman[98552]: 2026-02-23 08:44:15.99497143 +0000 UTC m=+0.163119058 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
container_name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git) Feb 23 03:44:16 localhost podman[98552]: 
unhealthy Feb 23 03:44:16 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:44:16 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:44:16 localhost podman[98551]: 2026-02-23 08:44:16.017541487 +0000 UTC m=+0.189040953 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:44:16 localhost podman[98554]: 2026-02-23 08:44:15.986064937 +0000 UTC m=+0.147631396 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:44:16 localhost podman[98553]: 2026-02-23 08:44:16.08900802 +0000 UTC m=+0.257534091 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, architecture=x86_64, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:44:16 localhost podman[98553]: unhealthy Feb 23 03:44:16 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:44:16 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:44:16 localhost podman[98554]: 2026-02-23 08:44:16.188278077 +0000 UTC m=+0.349844536 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:44:16 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:44:16 localhost podman[98551]: 2026-02-23 08:44:16.405168024 +0000 UTC m=+0.576667480 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, config_id=tripleo_step4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510) Feb 23 03:44:16 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:44:24 localhost sshd[98642]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:44:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:44:36 localhost systemd[1]: tmp-crun.N18ZRg.mount: Deactivated successfully. 
Feb 23 03:44:36 localhost podman[98644]: 2026-02-23 08:44:36.924987958 +0000 UTC m=+0.099321079 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Feb 23 03:44:36 localhost podman[98644]: 2026-02-23 08:44:36.940211223 +0000 UTC m=+0.114544374 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:44:36 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:44:38 localhost sshd[98663]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:44:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:44:42 localhost podman[98666]: 2026-02-23 08:44:42.932645113 +0000 UTC m=+0.098673610 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git) Feb 23 03:44:42 localhost podman[98666]: 2026-02-23 08:44:42.963308648 +0000 UTC m=+0.129337145 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 23 03:44:42 localhost systemd[1]: tmp-crun.tQqJRw.mount: Deactivated successfully. Feb 23 03:44:42 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:44:42 localhost podman[98668]: 2026-02-23 08:44:42.996950588 +0000 UTC m=+0.158182252 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 23 03:44:43 localhost podman[98665]: 2026-02-23 08:44:43.037199908 +0000 UTC m=+0.206572791 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.13, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z) Feb 23 03:44:43 localhost podman[98665]: 2026-02-23 08:44:43.045479451 +0000 
UTC m=+0.214852324 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3) Feb 23 03:44:43 localhost podman[98668]: 2026-02-23 08:44:43.056944446 +0000 UTC m=+0.218176160 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-cron, tcib_managed=true, version=17.1.13, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:44:43 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:44:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:44:43 localhost podman[98667]: 2026-02-23 08:44:43.143340283 +0000 UTC m=+0.307702596 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:44:43 localhost podman[98671]: 2026-02-23 08:44:43.197490315 +0000 UTC m=+0.352062697 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 
23 03:44:43 localhost podman[98667]: 2026-02-23 08:44:43.201189752 +0000 UTC m=+0.365552045 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 23 03:44:43 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:44:43 localhost podman[98671]: 2026-02-23 08:44:43.229860515 +0000 UTC m=+0.384432907 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:44:43 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:44:43 localhost sshd[98782]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:44:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:44:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:44:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:44:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:44:46 localhost podman[98784]: 2026-02-23 08:44:46.92069784 +0000 UTC m=+0.090641293 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:44:46 localhost systemd[1]: tmp-crun.yY256z.mount: Deactivated successfully. 
Feb 23 03:44:46 localhost podman[98785]: 2026-02-23 08:44:46.98200868 +0000 UTC m=+0.147789361 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:44:47 localhost podman[98787]: 2026-02-23 08:44:47.027222738 +0000 UTC m=+0.187280967 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:44:47 localhost podman[98785]: 2026-02-23 08:44:47.050409896 +0000 UTC m=+0.216190587 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 23 03:44:47 localhost podman[98785]: unhealthy Feb 23 03:44:47 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:44:47 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:44:47 localhost podman[98786]: 2026-02-23 08:44:47.137906678 +0000 UTC m=+0.300771466 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510) Feb 23 03:44:47 localhost podman[98786]: 2026-02-23 08:44:47.156575701 +0000 UTC m=+0.319440519 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, 
release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z) Feb 23 03:44:47 localhost podman[98786]: unhealthy Feb 23 03:44:47 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:44:47 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:44:47 localhost podman[98787]: 2026-02-23 08:44:47.247245265 +0000 UTC m=+0.407303454 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 
'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1) Feb 23 03:44:47 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:44:47 localhost podman[98784]: 2026-02-23 08:44:47.337433523 +0000 UTC m=+0.507376986 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Feb 23 03:44:47 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:44:47 localhost systemd[1]: tmp-crun.Dhens2.mount: Deactivated successfully. Feb 23 03:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:45:07 localhost podman[98954]: 2026-02-23 08:45:07.938950875 +0000 UTC m=+0.106402305 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_id=tripleo_step3) Feb 23 03:45:07 localhost podman[98954]: 2026-02-23 08:45:07.975934691 +0000 UTC m=+0.143386121 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, container_name=collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:10:15Z, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:45:07 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:45:10 localhost sshd[98973]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:45:13 localhost systemd[1]: tmp-crun.GZ1t3m.mount: Deactivated successfully. Feb 23 03:45:13 localhost podman[98976]: 2026-02-23 08:45:13.921905455 +0000 UTC m=+0.091980507 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, architecture=x86_64, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git) Feb 23 03:45:13 localhost systemd[1]: tmp-crun.aY5ypL.mount: Deactivated successfully. 
Feb 23 03:45:13 localhost podman[98978]: 2026-02-23 08:45:13.950677119 +0000 UTC m=+0.110353810 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:45:13 localhost podman[98976]: 2026-02-23 08:45:13.980288752 +0000 UTC m=+0.150363794 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, 
vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible) Feb 23 03:45:13 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:45:14 localhost podman[98975]: 2026-02-23 08:45:13.981175509 +0000 UTC m=+0.152146559 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, 
batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Feb 23 03:45:14 localhost podman[98977]: 2026-02-23 08:45:14.037350776 +0000 UTC m=+0.200225129 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, 
maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:45:14 localhost podman[98989]: 2026-02-23 08:45:14.09031844 +0000 UTC m=+0.242998659 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=nova_compute, release=1766032510, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13) Feb 23 03:45:14 localhost podman[98975]: 2026-02-23 08:45:14.115836452 +0000 UTC m=+0.286807572 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:45:14 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:45:14 localhost podman[98978]: 2026-02-23 08:45:14.139299068 +0000 UTC m=+0.298975809 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, release=1766032510, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:45:14 localhost podman[98989]: 2026-02-23 08:45:14.150265326 +0000 UTC m=+0.302945505 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=nova_compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:45:14 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated 
successfully. Feb 23 03:45:14 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:45:14 localhost podman[98977]: 2026-02-23 08:45:14.170771779 +0000 UTC m=+0.333646152 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack 
TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4) Feb 23 03:45:14 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:45:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:45:17 localhost systemd[1]: tmp-crun.PtZjaP.mount: Deactivated successfully. 
Feb 23 03:45:17 localhost podman[99093]: 2026-02-23 08:45:17.938032423 +0000 UTC m=+0.103635107 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510) Feb 23 03:45:18 localhost podman[99094]: 2026-02-23 08:45:17.8989779 +0000 UTC m=+0.065856734 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5) Feb 23 03:45:18 localhost podman[99092]: 2026-02-23 08:45:18.00492655 +0000 UTC m=+0.174963775 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team) Feb 23 03:45:18 localhost podman[99096]: 2026-02-23 08:45:18.012710028 +0000 UTC m=+0.173757827 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Feb 23 03:45:18 localhost podman[99093]: 2026-02-23 08:45:18.027186598 +0000 UTC m=+0.192789252 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_controller, managed_by=tripleo_ansible) Feb 23 03:45:18 localhost podman[99093]: unhealthy Feb 23 03:45:18 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:45:18 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:45:18 localhost podman[99094]: 2026-02-23 08:45:18.082853539 +0000 UTC m=+0.249732373 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:45:18 localhost podman[99094]: unhealthy Feb 23 03:45:18 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:45:18 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:45:18 localhost sshd[99178]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:45:18 localhost podman[99096]: 2026-02-23 08:45:18.203266398 +0000 UTC m=+0.364314187 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public) Feb 23 03:45:18 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:45:18 localhost podman[99092]: 2026-02-23 08:45:18.369401351 +0000 UTC m=+0.539438606 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:45:18 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:45:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:45:37 localhost recover_tripleo_nova_virtqemud[99182]: 61982 Feb 23 03:45:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 23 03:45:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:45:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:45:38 localhost podman[99183]: 2026-02-23 08:45:38.916707629 +0000 UTC m=+0.090360130 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:45:38 localhost podman[99183]: 2026-02-23 08:45:38.926408493 +0000 UTC m=+0.100060964 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, tcib_managed=true, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:45:38 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:45:44 localhost systemd[1]: tmp-crun.LZgVpk.mount: Deactivated successfully. 
Feb 23 03:45:44 localhost podman[99204]: 2026-02-23 08:45:44.983643962 +0000 UTC m=+0.152399440 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
container_name=iscsid, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5) Feb 23 03:45:45 localhost podman[99205]: 2026-02-23 08:45:44.94984074 +0000 UTC m=+0.115913544 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc.) 
Feb 23 03:45:45 localhost podman[99205]: 2026-02-23 08:45:45.028999257 +0000 UTC m=+0.195072021 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13) Feb 23 03:45:45 localhost podman[99206]: 2026-02-23 08:45:45.036894435 +0000 UTC m=+0.198799657 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git) Feb 23 03:45:45 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:45:45 localhost podman[99210]: 2026-02-23 08:45:45.092183161 +0000 UTC m=+0.248513378 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:45:45 localhost podman[99206]: 2026-02-23 08:45:45.145760055 +0000 UTC m=+0.307665277 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:45:45 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:45:45 localhost podman[99204]: 2026-02-23 08:45:45.172360301 +0000 UTC m=+0.341115809 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, container_name=iscsid, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, tcib_managed=true, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:45:45 localhost podman[99210]: 2026-02-23 08:45:45.181331362 +0000 UTC m=+0.337661589 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:45:45 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:45:45 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:45:45 localhost podman[99218]: 2026-02-23 08:45:45.151187055 +0000 UTC m=+0.302200625 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, release=1766032510, architecture=x86_64) Feb 23 03:45:45 localhost podman[99218]: 2026-02-23 08:45:45.231030674 +0000 UTC m=+0.382044204 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true) Feb 23 03:45:45 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:45:45 localhost systemd[1]: tmp-crun.0NbyUE.mount: Deactivated successfully. Feb 23 03:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:45:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:45:48 localhost systemd[1]: tmp-crun.GQeuwf.mount: Deactivated successfully. 
Feb 23 03:45:48 localhost podman[99318]: 2026-02-23 08:45:48.924009411 +0000 UTC m=+0.096418600 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64) Feb 23 03:45:48 localhost podman[99320]: 2026-02-23 08:45:48.974117575 +0000 UTC m=+0.141414644 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, tcib_managed=true) Feb 23 03:45:49 localhost podman[99319]: 2026-02-23 08:45:49.019201181 +0000 UTC 
m=+0.187447880 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ovn_controller, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z) Feb 23 03:45:49 localhost podman[99320]: 2026-02-23 08:45:49.044744034 +0000 UTC m=+0.212041143 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, version=17.1.13) Feb 23 03:45:49 localhost podman[99320]: unhealthy Feb 23 03:45:49 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:45:49 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:45:49 localhost podman[99319]: 2026-02-23 08:45:49.061294364 +0000 UTC m=+0.229541033 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:45:49 localhost podman[99319]: unhealthy Feb 23 03:45:49 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:45:49 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:45:49 localhost podman[99321]: 2026-02-23 08:45:49.130920381 +0000 UTC m=+0.294128362 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true) Feb 23 03:45:49 localhost podman[99321]: 2026-02-23 08:45:49.331317478 +0000 UTC m=+0.494525519 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.13, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:45:49 localhost podman[99318]: 2026-02-23 
08:45:49.341350443 +0000 UTC m=+0.513759662 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, url=https://www.redhat.com, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:45:49 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:45:49 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:45:53 localhost sshd[99481]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:45:58 localhost sshd[99483]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:46:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:46:09 localhost systemd[1]: tmp-crun.PxAnur.mount: Deactivated successfully. 
Feb 23 03:46:09 localhost podman[99485]: 2026-02-23 08:46:09.934636815 +0000 UTC m=+0.098085433 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Feb 23 03:46:09 localhost podman[99485]: 2026-02-23 08:46:09.950239556 +0000 UTC m=+0.113688214 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 
collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-collectd-container) Feb 23 03:46:09 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:46:15 localhost rhsm-service[6591]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Feb 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:46:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:46:15 localhost podman[99684]: 2026-02-23 08:46:15.929608837 +0000 UTC m=+0.095586825 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:46:15 localhost podman[99684]: 2026-02-23 08:46:15.968044254 +0000 UTC m=+0.134022162 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, 
release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4) Feb 23 03:46:15 localhost podman[99686]: 2026-02-23 08:46:15.965480734 +0000 UTC m=+0.124735800 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=logrotate_crond, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:46:15 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:46:16 localhost podman[99683]: 2026-02-23 08:46:16.040940605 +0000 UTC m=+0.210157244 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:46:16 localhost podman[99686]: 2026-02-23 08:46:16.055316566 +0000 UTC m=+0.214571652 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, tcib_managed=true, com.redhat.component=openstack-cron-container) Feb 23 03:46:16 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:46:16 localhost podman[99685]: 2026-02-23 08:46:16.133862474 +0000 UTC m=+0.297281431 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 23 03:46:16 localhost podman[99683]: 2026-02-23 08:46:16.153044487 +0000 UTC m=+0.322261116 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3) Feb 23 03:46:16 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:46:16 localhost podman[99685]: 2026-02-23 08:46:16.197281157 +0000 UTC m=+0.360700084 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public) Feb 23 03:46:16 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:46:16 localhost podman[99692]: 2026-02-23 08:46:16.248911058 +0000 UTC m=+0.405729158 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, 
vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:46:16 localhost podman[99692]: 2026-02-23 08:46:16.280293244 +0000 UTC m=+0.437111304 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step5, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:46:16 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:46:16 localhost systemd[1]: tmp-crun.R7bTyC.mount: Deactivated successfully. Feb 23 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:46:20 localhost systemd[1]: tmp-crun.zNZV86.mount: Deactivated successfully. Feb 23 03:46:20 localhost podman[99803]: 2026-02-23 08:46:20.210098063 +0000 UTC m=+0.053030530 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true) Feb 23 03:46:20 localhost podman[99803]: 2026-02-23 08:46:20.228032188 +0000 UTC m=+0.070964635 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, version=17.1.13, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:46:20 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Deactivated successfully. Feb 23 03:46:20 localhost systemd[1]: tmp-crun.p9Mrkl.mount: Deactivated successfully. 
Feb 23 03:46:20 localhost podman[99802]: 2026-02-23 08:46:20.261921384 +0000 UTC m=+0.104927153 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, tcib_managed=true) Feb 23 03:46:20 localhost podman[99805]: 2026-02-23 08:46:20.312964621 +0000 UTC m=+0.150726365 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com) Feb 23 03:46:20 localhost podman[99804]: 2026-02-23 08:46:20.363918314 +0000 UTC m=+0.203193225 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:46:20 localhost podman[99804]: 2026-02-23 08:46:20.421225088 +0000 UTC m=+0.260500009 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc.) Feb 23 03:46:20 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Deactivated successfully. 
Feb 23 03:46:20 localhost podman[99805]: 2026-02-23 08:46:20.470184138 +0000 UTC m=+0.307945882 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, url=https://www.redhat.com, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_id=tripleo_step1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:46:20 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:46:20 localhost podman[99802]: 2026-02-23 08:46:20.609212754 +0000 UTC m=+0.452218523 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, 
container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:46:20 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:46:34 localhost systemd[1]: session-28.scope: Deactivated successfully. Feb 23 03:46:34 localhost systemd[1]: session-28.scope: Consumed 7min 16.051s CPU time. Feb 23 03:46:34 localhost systemd-logind[759]: Session 28 logged out. Waiting for processes to exit. Feb 23 03:46:34 localhost systemd-logind[759]: Removed session 28. 
Feb 23 03:46:35 localhost sshd[99902]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:46:39 localhost sshd[99904]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:46:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:46:40 localhost podman[99906]: 2026-02-23 08:46:40.537219425 +0000 UTC m=+0.100674334 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd) Feb 23 03:46:40 localhost podman[99906]: 2026-02-23 08:46:40.578357878 +0000 UTC m=+0.141812767 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack 
Platform 17.1 collectd) Feb 23 03:46:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:46:45 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 23 03:46:45 localhost systemd[35778]: Activating special unit Exit the Session... Feb 23 03:46:45 localhost systemd[35778]: Removed slice User Background Tasks Slice. Feb 23 03:46:45 localhost systemd[35778]: Stopped target Main User Target. Feb 23 03:46:45 localhost systemd[35778]: Stopped target Basic System. Feb 23 03:46:45 localhost systemd[35778]: Stopped target Paths. Feb 23 03:46:45 localhost systemd[35778]: Stopped target Sockets. Feb 23 03:46:45 localhost systemd[35778]: Stopped target Timers. Feb 23 03:46:45 localhost systemd[35778]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 23 03:46:45 localhost systemd[35778]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 03:46:45 localhost systemd[35778]: Closed D-Bus User Message Bus Socket. Feb 23 03:46:45 localhost systemd[35778]: Stopped Create User's Volatile Files and Directories. Feb 23 03:46:45 localhost systemd[35778]: Removed slice User Application Slice. Feb 23 03:46:45 localhost systemd[35778]: Reached target Shutdown. Feb 23 03:46:45 localhost systemd[35778]: Finished Exit the Session. Feb 23 03:46:45 localhost systemd[35778]: Reached target Exit the Session. Feb 23 03:46:45 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 23 03:46:45 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 23 03:46:45 localhost systemd[1]: user@1003.service: Consumed 4.890s CPU time, read 0B from disk, written 7.0K to disk. Feb 23 03:46:45 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 23 03:46:45 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 23 03:46:45 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. 
Feb 23 03:46:45 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 23 03:46:45 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 23 03:46:45 localhost systemd[1]: user-1003.slice: Consumed 7min 20.971s CPU time. Feb 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:46:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:46:46 localhost podman[99930]: 2026-02-23 08:46:46.946965056 +0000 UTC m=+0.110118881 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, version=17.1.13) Feb 23 03:46:46 localhost systemd[1]: tmp-crun.V5Z92e.mount: Deactivated successfully. 
Feb 23 03:46:46 localhost podman[99938]: 2026-02-23 08:46:46.994701116 +0000 UTC m=+0.148055502 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:46:47 localhost podman[99930]: 2026-02-23 08:46:47.000268951 +0000 UTC m=+0.163422746 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, build-date=2026-01-12T23:07:47Z) Feb 23 03:46:47 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:46:47 localhost podman[99938]: 2026-02-23 08:46:47.025829235 +0000 UTC m=+0.179183611 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:46:47 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:46:47 localhost podman[99931]: 2026-02-23 08:46:47.043973674 +0000 UTC m=+0.204262298 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:46:47 localhost podman[99932]: 2026-02-23 08:46:47.098968892 +0000 UTC m=+0.255185218 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_id=tripleo_step4) Feb 23 03:46:47 localhost podman[99932]: 2026-02-23 08:46:47.111431904 +0000 UTC m=+0.267648240 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 
17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Feb 23 03:46:47 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:46:47 localhost podman[99931]: 2026-02-23 08:46:47.127283342 +0000 UTC m=+0.287571956 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64) Feb 23 03:46:47 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:46:47 localhost podman[99929]: 2026-02-23 08:46:47.203776395 +0000 UTC m=+0.372616748 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:46:47 localhost podman[99929]: 2026-02-23 08:46:47.239368333 +0000 UTC m=+0.408208706 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, 
version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:46:47 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:46:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:46:50 localhost podman[100044]: 2026-02-23 08:46:50.923017988 +0000 UTC m=+0.093957003 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:46:50 localhost podman[100044]: 2026-02-23 08:46:50.96733604 +0000 UTC m=+0.138275075 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, vendor=Red Hat, Inc., version=17.1.13, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:46:50 localhost systemd[1]: tmp-crun.cqLtvL.mount: Deactivated successfully. Feb 23 03:46:50 localhost podman[100044]: unhealthy Feb 23 03:46:50 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:46:50 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:46:51 localhost podman[100045]: 2026-02-23 08:46:51.025092125 +0000 UTC m=+0.192789709 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team) Feb 23 03:46:51 localhost podman[100045]: 2026-02-23 08:46:51.039137066 +0000 UTC m=+0.206834650 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:46:51 localhost podman[100045]: unhealthy Feb 23 03:46:51 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:46:51 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:46:51 localhost podman[100046]: 2026-02-23 08:46:50.992985626 +0000 UTC m=+0.157307243 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:46:51 localhost podman[100043]: 2026-02-23 08:46:51.180705484 +0000 UTC m=+0.354160708 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:46:51 localhost 
podman[100046]: 2026-02-23 08:46:51.224463088 +0000 UTC m=+0.388784715 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=metrics_qdr) Feb 23 03:46:51 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:46:51 localhost podman[100043]: 2026-02-23 08:46:51.568545549 +0000 UTC m=+0.742000763 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:46:51 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:46:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:46:53 localhost recover_tripleo_nova_virtqemud[100211]: 61982 Feb 23 03:46:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:46:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:46:56 localhost sshd[100212]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:47:10 localhost systemd[1]: tmp-crun.i1k5MN.mount: Deactivated successfully. Feb 23 03:47:10 localhost podman[100213]: 2026-02-23 08:47:10.943530014 +0000 UTC m=+0.113775565 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:47:10 localhost podman[100213]: 2026-02-23 08:47:10.956625316 +0000 UTC m=+0.126870827 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, batch=17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:47:10 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:47:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:47:17 localhost podman[100235]: 2026-02-23 08:47:17.937790572 +0000 UTC m=+0.099420394 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron) Feb 23 03:47:17 localhost systemd[1]: tmp-crun.r8Bz9o.mount: Deactivated successfully. 
Feb 23 03:47:17 localhost podman[100235]: 2026-02-23 08:47:17.977292813 +0000 UTC m=+0.138922665 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible) Feb 23 03:47:17 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:47:18 localhost systemd[1]: tmp-crun.nXqDK5.mount: Deactivated successfully. Feb 23 03:47:18 localhost podman[100233]: 2026-02-23 08:47:18.034209671 +0000 UTC m=+0.201362797 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:47:18 localhost podman[100241]: 2026-02-23 08:47:18.05104583 +0000 UTC m=+0.205704204 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.13, vcs-type=git) Feb 23 03:47:18 localhost podman[100233]: 2026-02-23 08:47:18.06536789 +0000 UTC m=+0.232521036 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:47:18 localhost podman[100241]: 2026-02-23 08:47:18.075858759 +0000 UTC m=+0.230517203 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, 
name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 23 03:47:18 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:47:18 localhost podman[100234]: 2026-02-23 08:47:17.983689863 +0000 UTC m=+0.147975729 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git) Feb 23 03:47:18 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:47:18 localhost podman[100234]: 2026-02-23 08:47:18.118299973 +0000 UTC m=+0.282585839 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:30Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4) Feb 23 03:47:18 localhost podman[100232]: 2026-02-23 08:47:18.077340196 +0000 UTC m=+0.250409249 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, container_name=iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com) Feb 23 03:47:18 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:47:18 localhost podman[100232]: 2026-02-23 08:47:18.163370669 +0000 UTC m=+0.336439732 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.buildah.version=1.41.5) Feb 23 03:47:18 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:47:20 localhost sshd[100353]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:47:21 localhost sshd[100355]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:47:21 localhost podman[100365]: 2026-02-23 08:47:21.92138498 +0000 UTC m=+0.077039861 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, 
io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git) Feb 23 03:47:21 localhost systemd[1]: tmp-crun.K1XcpV.mount: Deactivated successfully. Feb 23 03:47:21 localhost podman[100357]: 2026-02-23 08:47:21.978251186 +0000 UTC m=+0.144389547 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, container_name=nova_migration_target, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:47:22 localhost podman[100358]: 2026-02-23 08:47:22.025965465 +0000 UTC m=+0.187178201 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5) Feb 23 03:47:22 localhost podman[100359]: 2026-02-23 08:47:21.95258989 +0000 UTC m=+0.108469289 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:47:22 localhost podman[100358]: 2026-02-23 08:47:22.068105399 +0000 UTC m=+0.229318155 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:47:22 localhost podman[100358]: unhealthy Feb 23 03:47:22 localhost podman[100359]: 2026-02-23 08:47:22.086426265 +0000 UTC m=+0.242305674 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, vendor=Red Hat, Inc., 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:47:22 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:47:22 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:47:22 localhost podman[100359]: unhealthy Feb 23 03:47:22 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:47:22 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:47:22 localhost podman[100365]: 2026-02-23 08:47:22.147493004 +0000 UTC m=+0.303147945 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:47:22 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:47:22 localhost podman[100357]: 2026-02-23 08:47:22.363200101 +0000 UTC m=+0.529338502 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Feb 23 03:47:22 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:47:41 localhost podman[100445]: 2026-02-23 08:47:41.917131215 +0000 UTC m=+0.091014698 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, 
url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 23 03:47:41 localhost podman[100445]: 2026-02-23 08:47:41.95732987 +0000 UTC m=+0.131213323 container exec_died 
186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team) Feb 23 03:47:41 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:47:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:47:48 localhost podman[100472]: 2026-02-23 08:47:48.936863091 +0000 UTC m=+0.097411960 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git) Feb 23 03:47:48 localhost podman[100472]: 2026-02-23 08:47:48.970725677 +0000 UTC m=+0.131274556 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:47:48 localhost systemd[1]: tmp-crun.Z33k5t.mount: Deactivated successfully. 
Feb 23 03:47:48 localhost podman[100468]: 2026-02-23 08:47:48.983898661 +0000 UTC m=+0.148482177 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13) Feb 23 03:47:48 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:47:49 localhost podman[100466]: 2026-02-23 08:47:49.024374817 +0000 UTC m=+0.196162860 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1) Feb 23 03:47:49 localhost podman[100481]: 2026-02-23 08:47:48.988159496 +0000 UTC m=+0.144760401 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:47:49 localhost podman[100466]: 2026-02-23 08:47:49.06227916 +0000 UTC m=+0.234067213 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public) Feb 23 03:47:49 localhost podman[100481]: 2026-02-23 08:47:49.071279774 +0000 UTC m=+0.227880719 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 
nova-compute, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:47:49 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:47:49 localhost podman[100467]: 2026-02-23 08:47:49.079502472 +0000 UTC m=+0.247981520 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc.) Feb 23 03:47:49 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:47:49 localhost podman[100468]: 2026-02-23 08:47:49.090692045 +0000 UTC m=+0.255275631 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:47:49 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:47:49 localhost podman[100467]: 2026-02-23 08:47:49.135360231 +0000 UTC m=+0.303839299 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 23 03:47:49 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:47:52 localhost systemd[1]: tmp-crun.TDMUWJ.mount: Deactivated successfully. 
Feb 23 03:47:52 localhost podman[100579]: 2026-02-23 08:47:52.912990828 +0000 UTC m=+0.086737552 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 
nova-compute, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:47:52 localhost podman[100580]: 2026-02-23 08:47:52.984538991 +0000 UTC m=+0.156009684 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.buildah.version=1.41.5, 
vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:47:53 localhost podman[100580]: 2026-02-23 08:47:53.027495425 +0000 UTC m=+0.198966088 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:47:53 localhost podman[100580]: unhealthy Feb 23 03:47:53 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:47:53 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:47:53 localhost podman[100581]: 2026-02-23 08:47:52.945795702 +0000 UTC m=+0.114363273 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, container_name=ovn_metadata_agent, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:47:53 localhost podman[100582]: 2026-02-23 08:47:53.029023812 +0000 UTC m=+0.195190477 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:47:53 localhost podman[100581]: 2026-02-23 08:47:53.082522067 +0000 UTC m=+0.251089728 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com) Feb 23 03:47:53 localhost podman[100581]: unhealthy Feb 23 03:47:53 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:47:53 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:47:53 localhost podman[100582]: 2026-02-23 08:47:53.274428932 +0000 UTC m=+0.440595597 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1) Feb 23 03:47:53 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:47:53 localhost podman[100579]: 2026-02-23 08:47:53.327123672 +0000 UTC m=+0.500870336 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, version=17.1.13, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:47:53 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:48:00 localhost sshd[100747]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:48:06 localhost sshd[100749]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:48:12 localhost systemd[1]: tmp-crun.SoYhGy.mount: Deactivated successfully. Feb 23 03:48:12 localhost podman[100751]: 2026-02-23 08:48:12.943751967 +0000 UTC m=+0.108172758 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510) Feb 23 03:48:12 localhost podman[100751]: 2026-02-23 08:48:12.9594136 +0000 UTC m=+0.123834351 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, version=17.1.13, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=) Feb 23 03:48:12 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:48:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:48:19 localhost systemd[1]: tmp-crun.nPw427.mount: Deactivated successfully. Feb 23 03:48:19 localhost podman[100774]: 2026-02-23 08:48:19.933603242 +0000 UTC m=+0.092342530 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510) Feb 23 03:48:19 localhost podman[100772]: 2026-02-23 08:48:19.98051963 +0000 UTC m=+0.148145487 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, release=1766032510, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:48:19 localhost podman[100771]: 2026-02-23 08:48:19.996503893 +0000 UTC m=+0.165528494 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:48:20 localhost podman[100787]: 2026-02-23 08:48:19.953882951 +0000 UTC m=+0.102943113 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=nova_compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 
17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:48:20 localhost podman[100771]: 2026-02-23 08:48:20.008637355 +0000 UTC m=+0.177661946 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:48:20 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:48:20 localhost podman[100787]: 2026-02-23 08:48:20.036436951 +0000 UTC m=+0.185497133 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, 
distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step5, version=17.1.13, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:48:20 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:48:20 localhost podman[100773]: 2026-02-23 08:48:19.910473094 +0000 UTC m=+0.075459978 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com) Feb 23 03:48:20 localhost podman[100774]: 2026-02-23 08:48:20.065224717 +0000 UTC m=+0.223964035 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:48:20 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:48:20 localhost podman[100772]: 2026-02-23 08:48:20.084642769 +0000 UTC m=+0.252268636 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-type=git, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:48:20 localhost podman[100773]: 2026-02-23 08:48:20.096370478 +0000 UTC m=+0.261357382 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 23 03:48:20 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:48:20 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:48:23 localhost systemd[1]: tmp-crun.nKSRqe.mount: Deactivated successfully. Feb 23 03:48:23 localhost podman[100889]: 2026-02-23 08:48:23.947710257 +0000 UTC m=+0.099353200 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:48:23 localhost podman[100890]: 2026-02-23 08:48:23.991843737 +0000 UTC m=+0.140816346 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr) Feb 23 03:48:24 localhost podman[100889]: 2026-02-23 08:48:24.045498887 +0000 UTC m=+0.197141770 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:48:24 localhost podman[100889]: unhealthy Feb 23 03:48:24 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:48:24 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:48:24 localhost podman[100888]: 2026-02-23 08:48:24.046370354 +0000 UTC m=+0.202585461 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=) Feb 23 03:48:24 localhost podman[100887]: 2026-02-23 08:48:24.101791959 +0000 UTC m=+0.258236354 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:48:24 localhost podman[100888]: 2026-02-23 08:48:24.126277921 +0000 UTC m=+0.282493058 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 
'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 23 03:48:24 localhost podman[100888]: unhealthy Feb 23 03:48:24 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:48:24 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:48:24 localhost podman[100890]: 2026-02-23 08:48:24.206133115 +0000 UTC m=+0.355105724 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Feb 23 03:48:24 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:48:24 localhost podman[100887]: 2026-02-23 08:48:24.480356742 +0000 UTC m=+0.636801137 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510) Feb 23 03:48:24 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:48:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:48:37 localhost recover_tripleo_nova_virtqemud[100981]: 61982 Feb 23 03:48:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 23 03:48:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:48:39 localhost sshd[100982]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:48:43 localhost systemd[1]: tmp-crun.9GoDjU.mount: Deactivated successfully. Feb 23 03:48:43 localhost podman[100984]: 2026-02-23 08:48:43.93763612 +0000 UTC m=+0.105332868 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true) Feb 23 03:48:43 localhost podman[100984]: 2026-02-23 08:48:43.987061717 +0000 UTC m=+0.154758495 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, 
release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack 
Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:48:44 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:48:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:48:50 localhost podman[101004]: 2026-02-23 08:48:50.921159687 +0000 UTC m=+0.095006433 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public) Feb 23 03:48:50 localhost podman[101004]: 2026-02-23 08:48:50.937270084 +0000 UTC m=+0.111116860 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, distribution-scope=public) Feb 23 03:48:50 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:48:50 localhost podman[101007]: 2026-02-23 08:48:50.984553893 +0000 UTC m=+0.145110411 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510) Feb 23 03:48:51 localhost podman[101007]: 2026-02-23 08:48:51.028298441 +0000 UTC m=+0.188854959 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Feb 23 03:48:51 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:48:51 localhost podman[101005]: 2026-02-23 08:48:51.081881908 +0000 UTC m=+0.250269902 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com) Feb 23 03:48:51 localhost podman[101006]: 2026-02-23 08:48:51.034357512 +0000 UTC m=+0.200175576 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:48:51 localhost podman[101006]: 2026-02-23 08:48:51.115327842 +0000 UTC m=+0.281145896 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 23 03:48:51 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:48:51 localhost podman[101012]: 2026-02-23 08:48:51.134652871 +0000 UTC m=+0.294369122 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64) Feb 23 03:48:51 localhost podman[101005]: 2026-02-23 08:48:51.164196632 +0000 UTC m=+0.332584596 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, 
name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, distribution-scope=public, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:48:51 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:48:51 localhost podman[101012]: 2026-02-23 08:48:51.217698806 +0000 UTC m=+0.377415017 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13) 
Feb 23 03:48:51 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:48:51 localhost systemd[1]: tmp-crun.FfK6ZX.mount: Deactivated successfully. Feb 23 03:48:53 localhost sshd[101124]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:48:54 localhost systemd[1]: tmp-crun.gGYpGU.mount: Deactivated successfully. Feb 23 03:48:54 localhost podman[101126]: 2026-02-23 08:48:54.446119155 +0000 UTC m=+0.087436735 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.expose-services=) Feb 23 03:48:54 localhost podman[101126]: 2026-02-23 08:48:54.463612285 +0000 UTC m=+0.104929945 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc.) Feb 23 03:48:54 localhost systemd[1]: tmp-crun.V6bix7.mount: Deactivated successfully. 
Feb 23 03:48:54 localhost podman[101127]: 2026-02-23 08:48:54.506380263 +0000 UTC m=+0.144187603 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:48:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:48:54 localhost podman[101128]: 2026-02-23 08:48:54.550244964 +0000 UTC m=+0.184403059 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, 
io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public) Feb 23 03:48:54 localhost podman[101126]: unhealthy Feb 23 03:48:54 localhost podman[101127]: 2026-02-23 08:48:54.574018263 +0000 UTC m=+0.211825553 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:48:54 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:48:54 localhost 
systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:48:54 localhost podman[101127]: unhealthy Feb 23 03:48:54 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:48:54 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:48:54 localhost podman[101177]: 2026-02-23 08:48:54.633491776 +0000 UTC m=+0.094739445 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=) Feb 23 03:48:54 localhost podman[101128]: 2026-02-23 08:48:54.769423277 +0000 UTC m=+0.403581332 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, container_name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:48:54 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:48:55 localhost podman[101177]: 2026-02-23 08:48:55.007898798 +0000 UTC m=+0.469146527 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 
nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git) Feb 23 03:48:55 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:49:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:49:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:49:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:49:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5421 writes, 
705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:49:14 localhost systemd[1]: tmp-crun.AUFeYf.mount: Deactivated successfully. Feb 23 03:49:14 localhost podman[101297]: 2026-02-23 08:49:14.929937253 +0000 UTC m=+0.104252715 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z) Feb 23 03:49:14 localhost podman[101297]: 2026-02-23 08:49:14.971359457 +0000 UTC m=+0.145674879 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible) Feb 23 03:49:14 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:49:19 localhost sshd[101317]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:49:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:49:21 localhost systemd[1]: tmp-crun.pvymrP.mount: Deactivated successfully. 
Feb 23 03:49:21 localhost podman[101319]: 2026-02-23 08:49:21.916507085 +0000 UTC m=+0.090312456 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, release=1766032510, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:49:21 localhost podman[101327]: 2026-02-23 08:49:21.983091862 +0000 UTC m=+0.138839974 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Feb 23 03:49:22 localhost podman[101328]: 2026-02-23 08:49:22.0459098 +0000 UTC m=+0.200721322 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5) Feb 23 03:49:22 localhost podman[101319]: 2026-02-23 08:49:22.05256666 +0000 UTC m=+0.226372071 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, container_name=iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:34:43Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team) Feb 23 03:49:22 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:49:22 localhost podman[101328]: 2026-02-23 08:49:22.074379616 +0000 UTC m=+0.229191158 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13) Feb 23 03:49:22 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:49:22 localhost podman[101320]: 2026-02-23 08:49:22.091363981 +0000 UTC m=+0.256825339 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:49:22 localhost podman[101321]: 2026-02-23 08:49:21.951687063 +0000 UTC m=+0.113270989 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, tcib_managed=true) Feb 23 03:49:22 localhost podman[101327]: 2026-02-23 08:49:22.116753262 +0000 UTC m=+0.272501394 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, 
io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git) Feb 23 03:49:22 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:49:22 localhost podman[101321]: 2026-02-23 08:49:22.135296125 +0000 UTC m=+0.296880001 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 
17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team) Feb 23 03:49:22 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:49:22 localhost podman[101320]: 2026-02-23 08:49:22.151364942 +0000 UTC m=+0.316826270 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute) Feb 23 03:49:22 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:49:24 localhost podman[101431]: 2026-02-23 08:49:24.917903713 +0000 UTC m=+0.092351260 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red 
Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:49:24 localhost systemd[1]: tmp-crun.tmrniE.mount: Deactivated successfully. Feb 23 03:49:24 localhost podman[101433]: 2026-02-23 08:49:24.979132061 +0000 UTC m=+0.148451036 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:49:24 localhost podman[101431]: 2026-02-23 08:49:24.986913677 +0000 UTC m=+0.161361114 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible) Feb 23 03:49:24 localhost podman[101431]: unhealthy Feb 23 03:49:24 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:49:24 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:49:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:49:25 localhost podman[101432]: 2026-02-23 08:49:25.085208462 +0000 UTC m=+0.254056023 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, tcib_managed=true, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Feb 23 03:49:25 localhost podman[101432]: 2026-02-23 08:49:25.130565551 +0000 UTC m=+0.299413102 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1) Feb 23 03:49:25 localhost podman[101432]: unhealthy Feb 23 03:49:25 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:49:25 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:49:25 localhost podman[101433]: 2026-02-23 08:49:25.169312651 +0000 UTC m=+0.338631636 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:49:25 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:49:25 localhost podman[101500]: 2026-02-23 08:49:25.181091262 +0000 UTC m=+0.089497450 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, release=1766032510, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_migration_target) Feb 23 03:49:25 localhost podman[101500]: 2026-02-23 08:49:25.592496319 +0000 UTC m=+0.500902507 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, 
vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:49:25 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:49:40 localhost sshd[101525]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:49:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:49:45 localhost systemd[1]: tmp-crun.NXyRKd.mount: Deactivated successfully. 
Feb 23 03:49:45 localhost podman[101527]: 2026-02-23 08:49:45.918407163 +0000 UTC m=+0.090100415 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510) Feb 23 03:49:45 localhost podman[101527]: 2026-02-23 08:49:45.928403216 +0000 UTC m=+0.100096498 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, release=1766032510, batch=17.1_20260112.1) Feb 23 03:49:45 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:49:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:49:52 localhost podman[101548]: 2026-02-23 08:49:52.945070046 +0000 UTC m=+0.117296108 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, batch=17.1_20260112.1) Feb 23 03:49:52 localhost podman[101548]: 2026-02-23 08:49:52.954079029 +0000 UTC m=+0.126305091 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, 
url=https://www.redhat.com) Feb 23 03:49:52 localhost podman[101549]: 2026-02-23 08:49:52.917127011 +0000 UTC m=+0.089126496 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, build-date=2026-01-12T23:07:47Z, version=17.1.13, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Feb 23 03:49:52 localhost systemd[1]: tmp-crun.nZ17D1.mount: Deactivated successfully. Feb 23 03:49:52 localhost podman[101551]: 2026-02-23 08:49:52.985801394 +0000 UTC m=+0.145034988 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, release=1766032510, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 23 03:49:52 localhost podman[101551]: 2026-02-23 08:49:52.993889357 +0000 UTC m=+0.153122971 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, version=17.1.13, tcib_managed=true, distribution-scope=public, 
io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4) Feb 23 03:49:53 localhost podman[101549]: 2026-02-23 08:49:53.003541939 +0000 UTC m=+0.175541474 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team) Feb 23 03:49:53 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:49:53 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:49:53 localhost podman[101550]: 2026-02-23 08:49:53.085960813 +0000 UTC m=+0.251584187 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
release=1766032510, distribution-scope=public, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible) Feb 23 03:49:53 localhost podman[101550]: 2026-02-23 08:49:53.122272411 +0000 UTC m=+0.287895825 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public) Feb 23 03:49:53 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:49:53 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:49:53 localhost podman[101562]: 2026-02-23 08:49:53.20072939 +0000 UTC m=+0.357446805 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:49:53 localhost podman[101562]: 2026-02-23 08:49:53.257440719 +0000 UTC m=+0.414158104 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, architecture=x86_64, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510) Feb 23 03:49:53 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:49:55 localhost podman[101666]: 2026-02-23 08:49:55.92379729 +0000 UTC m=+0.090037804 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, managed_by=tripleo_ansible) Feb 23 03:49:55 localhost podman[101668]: 2026-02-23 08:49:55.973514598 +0000 UTC m=+0.132817914 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5) Feb 23 03:49:55 localhost podman[101666]: 2026-02-23 08:49:55.992779702 +0000 UTC m=+0.159020246 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible) Feb 23 03:49:55 localhost podman[101666]: unhealthy Feb 23 03:49:56 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:49:56 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:49:56 localhost systemd[1]: tmp-crun.MWdEIR.mount: Deactivated successfully. 
Feb 23 03:49:56 localhost podman[101667]: 2026-02-23 08:49:56.051098471 +0000 UTC m=+0.213163603 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:49:56 localhost podman[101667]: 2026-02-23 08:49:56.065251884 +0000 UTC m=+0.227317046 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1) Feb 23 03:49:56 localhost podman[101667]: unhealthy Feb 23 03:49:56 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:49:56 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:49:56 localhost podman[101665]: 2026-02-23 08:49:56.142265038 +0000 UTC m=+0.312001251 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true) Feb 23 03:49:56 localhost podman[101668]: 2026-02-23 08:49:56.200725101 +0000 UTC m=+0.360028417 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, distribution-scope=public) Feb 23 03:49:56 localhost systemd[1]: 
f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:49:56 localhost podman[101665]: 2026-02-23 08:49:56.554939585 +0000 UTC m=+0.724675768 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, container_name=nova_migration_target, batch=17.1_20260112.1) Feb 23 03:49:56 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:49:56 localhost sshd[101757]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:50:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:50:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:50:16 localhost recover_tripleo_nova_virtqemud[101837]: 61982 Feb 23 03:50:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:50:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:50:16 localhost podman[101835]: 2026-02-23 08:50:16.934187549 +0000 UTC m=+0.100742629 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.buildah.version=1.41.5, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:50:16 localhost podman[101835]: 2026-02-23 08:50:16.974305907 +0000 UTC m=+0.140860957 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 23 03:50:16 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:50:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:50:23 localhost podman[101860]: 2026-02-23 08:50:23.933059082 +0000 UTC m=+0.093002876 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, batch=17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, url=https://www.redhat.com) Feb 23 03:50:23 localhost podman[101860]: 2026-02-23 08:50:23.944601624 +0000 UTC m=+0.104545458 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond) Feb 23 03:50:23 localhost systemd[1]: 
b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:50:23 localhost systemd[1]: tmp-crun.IvMKvH.mount: Deactivated successfully. Feb 23 03:50:23 localhost podman[101859]: 2026-02-23 08:50:23.991047629 +0000 UTC m=+0.155708361 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=) Feb 23 03:50:24 localhost podman[101871]: 2026-02-23 08:50:24.025135309 +0000 UTC m=+0.180730547 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:50:24 localhost podman[101857]: 2026-02-23 08:50:24.04496211 +0000 UTC m=+0.215113874 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, version=17.1.13) Feb 23 03:50:24 localhost podman[101859]: 2026-02-23 08:50:24.052713772 +0000 UTC m=+0.217374524 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 23 03:50:24 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:50:24 localhost podman[101857]: 2026-02-23 08:50:24.081595138 +0000 UTC m=+0.251746912 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510) Feb 23 03:50:24 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:50:24 localhost podman[101871]: 2026-02-23 08:50:24.134846908 +0000 UTC m=+0.290442146 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com) Feb 23 03:50:24 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:50:24 localhost podman[101858]: 2026-02-23 08:50:24.1406728 +0000 UTC m=+0.308098089 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:50:24 localhost podman[101858]: 2026-02-23 08:50:24.221987419 +0000 UTC m=+0.389412678 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4) Feb 23 03:50:24 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:50:26 localhost podman[101973]: 2026-02-23 08:50:26.919168378 +0000 UTC m=+0.090229241 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:50:26 localhost podman[101973]: 2026-02-23 08:50:26.96422981 +0000 UTC m=+0.135290693 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=) Feb 23 03:50:26 localhost podman[101973]: unhealthy Feb 23 03:50:26 localhost systemd[1]: tmp-crun.IDSr7w.mount: Deactivated successfully. 
Feb 23 03:50:26 localhost podman[101974]: 2026-02-23 08:50:26.980546772 +0000 UTC m=+0.149252361 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc.) Feb 23 03:50:26 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:50:26 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:50:27 localhost podman[101974]: 2026-02-23 08:50:27.019963707 +0000 UTC m=+0.188669276 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:50:27 localhost podman[101974]: unhealthy Feb 23 03:50:27 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:50:27 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:50:27 localhost podman[101975]: 2026-02-23 08:50:27.034390539 +0000 UTC m=+0.198408680 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vendor=Red Hat, Inc.) Feb 23 03:50:27 localhost podman[101972]: 2026-02-23 08:50:27.073498265 +0000 UTC m=+0.247004033 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, 
container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible) Feb 23 03:50:27 localhost podman[101975]: 2026-02-23 08:50:27.24943417 +0000 UTC m=+0.413452271 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., 
io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Feb 23 03:50:27 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:50:27 localhost podman[101972]: 2026-02-23 08:50:27.481443712 +0000 UTC m=+0.654949440 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, build-date=2026-01-12T23:32:04Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:50:27 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:50:27 localhost sshd[102065]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:50:37 localhost sshd[102067]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:50:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:50:47 localhost podman[102069]: 2026-02-23 08:50:47.909553321 +0000 UTC m=+0.080595988 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:50:47 localhost podman[102069]: 2026-02-23 08:50:47.949419 +0000 UTC m=+0.120461667 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:50:47 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:50:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:50:54 localhost podman[102092]: 2026-02-23 08:50:54.934985575 +0000 UTC m=+0.094866905 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack 
TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc.) 
Feb 23 03:50:54 localhost podman[102092]: 2026-02-23 08:50:54.970158378 +0000 UTC m=+0.130039728 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:50:55 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:50:55 localhost podman[102089]: 2026-02-23 08:50:55.021509627 +0000 UTC m=+0.191187504 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, container_name=iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z) Feb 23 03:50:55 localhost podman[102089]: 2026-02-23 08:50:55.036215818 +0000 UTC m=+0.205893755 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., container_name=iscsid, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:50:55 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:50:55 localhost podman[102090]: 2026-02-23 08:50:55.036628102 +0000 UTC m=+0.202960794 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team) Feb 23 03:50:55 localhost podman[102091]: 2026-02-23 08:50:55.093536305 +0000 UTC m=+0.255520760 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:50:55 localhost podman[102097]: 2026-02-23 08:50:55.145198204 +0000 UTC m=+0.301803591 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, vcs-type=git) Feb 23 03:50:55 localhost podman[102091]: 2026-02-23 08:50:55.15559401 +0000 UTC m=+0.317578505 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4) Feb 23 03:50:55 localhost podman[102090]: 2026-02-23 08:50:55.167672679 +0000 UTC m=+0.334005331 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1) Feb 23 03:50:55 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:50:55 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:50:55 localhost podman[102097]: 2026-02-23 08:50:55.181537043 +0000 UTC m=+0.338142470 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Feb 23 03:50:55 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:50:57 localhost podman[102207]: 2026-02-23 08:50:57.928289976 +0000 UTC m=+0.100173152 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 23 03:50:58 localhost podman[102210]: 2026-02-23 08:50:58.006946851 +0000 UTC m=+0.164598961 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, 
url=https://www.redhat.com, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public) Feb 23 03:50:58 localhost podman[102209]: 2026-02-23 08:50:57.974397681 +0000 UTC m=+0.138731430 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z) Feb 23 03:50:58 localhost podman[102208]: 2026-02-23 08:50:58.056368921 +0000 UTC m=+0.221016929 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4) Feb 23 03:50:58 localhost podman[102208]: 2026-02-23 08:50:58.099362478 +0000 UTC m=+0.264010456 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-01-12T22:36:40Z) Feb 23 03:50:58 localhost podman[102208]: unhealthy Feb 23 03:50:58 localhost podman[102209]: 2026-02-23 08:50:58.108797984 +0000 UTC m=+0.273131723 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, version=17.1.13, vcs-type=git, io.openshift.expose-services=) Feb 23 03:50:58 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:50:58 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:50:58 localhost podman[102209]: unhealthy Feb 23 03:50:58 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:50:58 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:50:58 localhost podman[102210]: 2026-02-23 08:50:58.214197428 +0000 UTC m=+0.371849488 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, 
io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public) Feb 23 03:50:58 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:50:58 localhost podman[102207]: 2026-02-23 08:50:58.363544509 +0000 UTC m=+0.535427735 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target) Feb 23 03:50:58 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:51:17 localhost sshd[102423]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:51:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:51:18 localhost systemd[1]: tmp-crun.H0laqn.mount: Deactivated successfully. 
Feb 23 03:51:18 localhost podman[102425]: 2026-02-23 08:51:18.460461564 +0000 UTC m=+0.087945647 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:51:18 localhost podman[102425]: 2026-02-23 08:51:18.503468222 +0000 UTC m=+0.130952285 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:51:18 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:51:19 localhost sshd[102445]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:51:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:51:25 localhost podman[102447]: 2026-02-23 08:51:25.983492609 +0000 UTC m=+0.149758826 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, container_name=iscsid, release=1766032510, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-iscsid-container) Feb 23 03:51:26 localhost podman[102447]: 2026-02-23 08:51:26.01862956 +0000 UTC m=+0.184895737 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., tcib_managed=true, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:51:26 localhost podman[102450]: 2026-02-23 08:51:26.032745142 +0000 UTC m=+0.191275297 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, container_name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-cron-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public) Feb 23 03:51:26 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:51:26 localhost podman[102448]: 2026-02-23 08:51:26.098594437 +0000 UTC m=+0.265427302 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64) Feb 23 03:51:26 localhost podman[102451]: 2026-02-23 08:51:25.952038762 +0000 UTC m=+0.106278743 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true) Feb 23 03:51:26 localhost podman[102450]: 2026-02-23 08:51:26.117634803 +0000 UTC m=+0.276165008 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5) Feb 23 03:51:26 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:51:26 localhost podman[102451]: 2026-02-23 08:51:26.13733025 +0000 UTC m=+0.291570211 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-type=git, container_name=nova_compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true) Feb 23 03:51:26 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:51:26 localhost podman[102449]: 2026-02-23 08:51:26.188464154 +0000 UTC m=+0.351033326 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:07:30Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 23 03:51:26 localhost podman[102448]: 2026-02-23 08:51:26.211701332 +0000 UTC m=+0.378534217 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z) Feb 23 03:51:26 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:51:26 localhost podman[102449]: 2026-02-23 08:51:26.244958554 +0000 UTC m=+0.407527716 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:51:26 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:51:26 localhost systemd[1]: tmp-crun.LiCk7Y.mount: Deactivated successfully. Feb 23 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:51:28 localhost podman[102570]: 2026-02-23 08:51:28.929126284 +0000 UTC m=+0.090617691 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:51:28 localhost systemd[1]: tmp-crun.S2K9cQ.mount: Deactivated successfully. Feb 23 03:51:28 localhost podman[102568]: 2026-02-23 08:51:28.988898528 +0000 UTC m=+0.156042892 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, tcib_managed=true, url=https://www.redhat.com) Feb 23 03:51:29 localhost podman[102569]: 2026-02-23 08:51:29.027034314 +0000 UTC m=+0.189260385 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 23 03:51:29 localhost podman[102569]: 2026-02-23 08:51:29.042049284 +0000 UTC m=+0.204275335 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4) Feb 23 03:51:29 localhost podman[102569]: unhealthy Feb 23 03:51:29 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:51:29 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:51:29 localhost podman[102568]: 2026-02-23 08:51:29.057808928 +0000 UTC m=+0.224953272 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, version=17.1.13, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:51:29 localhost podman[102568]: unhealthy Feb 23 03:51:29 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:51:29 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:51:29 localhost podman[102567]: 2026-02-23 08:51:29.129517696 +0000 UTC m=+0.299072246 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:51:29 localhost podman[102570]: 2026-02-23 08:51:29.164707339 +0000 UTC m=+0.326198736 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, 
io.buildah.version=1.41.5) Feb 23 03:51:29 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:51:29 localhost podman[102567]: 2026-02-23 08:51:29.527136321 +0000 UTC m=+0.696690921 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z) Feb 23 03:51:29 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:51:48 localhost podman[102659]: 2026-02-23 08:51:48.921496783 +0000 UTC m=+0.092137779 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Feb 23 03:51:48 localhost podman[102659]: 2026-02-23 08:51:48.960287278 +0000 UTC m=+0.130928264 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Feb 23 03:51:48 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:51:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:51:56 localhost systemd[1]: Starting dnf makecache... Feb 23 03:51:56 localhost systemd[1]: tmp-crun.Ha1P52.mount: Deactivated successfully. Feb 23 03:51:56 localhost podman[102681]: 2026-02-23 08:51:56.93597607 +0000 UTC m=+0.102399692 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:51:56 localhost podman[102681]: 2026-02-23 08:51:56.987302144 +0000 UTC m=+0.153725826 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:51:56 localhost systemd[1]: tmp-crun.mYwtXG.mount: Deactivated successfully. Feb 23 03:51:57 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:51:57 localhost podman[102680]: 2026-02-23 08:51:57.048308275 +0000 UTC m=+0.216203263 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=iscsid) Feb 23 03:51:57 localhost podman[102683]: 2026-02-23 08:51:57.079737509 +0000 UTC m=+0.238398013 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:51:57 localhost podman[102680]: 2026-02-23 08:51:57.087253219 +0000 UTC m=+0.255148177 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Feb 23 03:51:57 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:51:57 localhost podman[102682]: 2026-02-23 08:51:57.000521389 +0000 UTC m=+0.161276927 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Feb 23 03:51:57 localhost podman[102683]: 2026-02-23 08:51:57.111075649 +0000 UTC m=+0.269736143 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, version=17.1.13) Feb 23 03:51:57 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:51:57 localhost podman[102682]: 2026-02-23 08:51:57.135374795 +0000 UTC m=+0.296130343 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64) Feb 23 03:51:57 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:51:57 localhost dnf[102695]: Updating Subscription Management repositories. 
Feb 23 03:51:57 localhost podman[102690]: 2026-02-23 08:51:57.091118037 +0000 UTC m=+0.245304084 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:51:57 localhost podman[102690]: 2026-02-23 08:51:57.224717635 +0000 UTC m=+0.378903612 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:51:57 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:51:58 localhost dnf[102695]: Metadata cache refreshed recently. Feb 23 03:51:59 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Feb 23 03:51:59 localhost systemd[1]: Finished dnf makecache. Feb 23 03:51:59 localhost systemd[1]: dnf-makecache.service: Consumed 2.311s CPU time. Feb 23 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:51:59 localhost podman[102795]: 2026-02-23 08:51:59.343489085 +0000 UTC m=+0.076124516 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4) Feb 23 03:51:59 localhost podman[102794]: 2026-02-23 08:51:59.408294502 +0000 UTC m=+0.139923562 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible) Feb 23 03:51:59 localhost podman[102796]: 2026-02-23 08:51:59.373927418 +0000 UTC m=+0.100232935 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:51:59 localhost podman[102795]: 2026-02-23 08:51:59.430616437 +0000 UTC m=+0.163251788 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 23 03:51:59 localhost podman[102795]: unhealthy Feb 23 03:51:59 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:51:59 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:51:59 localhost podman[102794]: 2026-02-23 08:51:59.453275342 +0000 UTC m=+0.184904422 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, 
io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:51:59 localhost podman[102794]: unhealthy Feb 23 03:51:59 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:51:59 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:51:59 localhost podman[102796]: 2026-02-23 08:51:59.589419217 +0000 UTC m=+0.315724724 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1) Feb 23 03:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:51:59 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:51:59 localhost podman[102863]: 2026-02-23 08:51:59.68897998 +0000 UTC m=+0.070770281 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1) Feb 23 03:52:00 localhost podman[102863]: 2026-02-23 08:52:00.10536118 +0000 UTC m=+0.487151481 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:52:00 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:52:01 localhost sshd[102886]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:52:02 localhost sshd[102888]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:52:17 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:52:17 localhost recover_tripleo_nova_virtqemud[102968]: 61982 Feb 23 03:52:17 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Feb 23 03:52:17 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:52:19 localhost podman[102969]: 2026-02-23 08:52:19.926041376 +0000 UTC m=+0.095507080 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:52:19 localhost podman[102969]: 2026-02-23 08:52:19.936823737 +0000 UTC m=+0.106289401 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:52:19 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:52:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:52:27 localhost podman[102989]: 2026-02-23 08:52:27.929172701 +0000 UTC m=+0.101341179 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z) Feb 23 03:52:27 localhost podman[102989]: 2026-02-23 08:52:27.943287283 +0000 UTC m=+0.115455781 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, container_name=iscsid, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:52:27 localhost systemd[1]: tmp-crun.h4aXIi.mount: Deactivated successfully. Feb 23 03:52:27 localhost podman[102990]: 2026-02-23 08:52:27.989854762 +0000 UTC m=+0.158005447 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z) Feb 23 03:52:28 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:52:28 localhost podman[102998]: 2026-02-23 08:52:28.090777807 +0000 UTC m=+0.248670208 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, config_id=tripleo_step5, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:52:28 localhost podman[102990]: 2026-02-23 08:52:28.095332996 +0000 UTC m=+0.263483681 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:52:28 localhost podman[102992]: 2026-02-23 08:52:28.10588394 +0000 UTC m=+0.266633148 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:52:28 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:52:28 localhost podman[102992]: 2026-02-23 08:52:28.118340883 +0000 UTC m=+0.279090031 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:52:28 localhost podman[102998]: 2026-02-23 08:52:28.127341868 +0000 UTC m=+0.285234239 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com) Feb 23 03:52:28 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:52:28 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:52:28 localhost podman[102991]: 2026-02-23 08:52:28.07330404 +0000 UTC m=+0.237324818 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true) Feb 23 03:52:28 localhost podman[102991]: 2026-02-23 08:52:28.207481366 +0000 UTC m=+0.371502124 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, 
config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git) Feb 23 03:52:28 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:52:28 localhost systemd[1]: tmp-crun.dvkcaG.mount: Deactivated successfully. Feb 23 03:52:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:52:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:52:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:52:29 localhost podman[103103]: 2026-02-23 08:52:29.90609306 +0000 UTC m=+0.078179499 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, container_name=ovn_metadata_agent) Feb 23 03:52:29 localhost podman[103102]: 2026-02-23 08:52:29.958991313 +0000 UTC m=+0.132962010 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:52:29 localhost podman[103103]: 2026-02-23 08:52:29.953437122 +0000 UTC m=+0.125523521 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 23 03:52:29 localhost podman[103102]: 2026-02-23 08:52:29.980216413 +0000 UTC m=+0.154187060 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, architecture=x86_64, container_name=ovn_controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container) Feb 23 03:52:29 localhost podman[103102]: unhealthy Feb 23 03:52:29 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:52:29 localhost systemd[1]: 
1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:52:30 localhost podman[103103]: unhealthy Feb 23 03:52:30 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:52:30 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:52:30 localhost podman[103104]: 2026-02-23 08:52:30.081042235 +0000 UTC m=+0.246013986 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 23 03:52:30 localhost podman[103104]: 2026-02-23 08:52:30.277415238 +0000 UTC m=+0.442386979 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:52:30 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:52:30 localhost podman[103169]: 2026-02-23 08:52:30.381719526 +0000 UTC m=+0.077614600 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-nova-compute-container, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com) Feb 23 03:52:30 localhost podman[103169]: 2026-02-23 08:52:30.780709233 +0000 UTC m=+0.476604297 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:52:30 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:52:43 localhost sshd[103193]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:52:49 localhost sshd[103195]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:52:50 localhost podman[103197]: 2026-02-23 08:52:50.536177946 +0000 UTC m=+0.097425289 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:52:50 localhost podman[103197]: 2026-02-23 08:52:50.551385452 +0000 UTC m=+0.112632835 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 03:52:50 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:52:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:52:58 localhost systemd[1]: tmp-crun.VMa3gl.mount: Deactivated successfully. Feb 23 03:52:58 localhost podman[103218]: 2026-02-23 08:52:58.966936119 +0000 UTC m=+0.102733771 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:52:59 localhost podman[103217]: 2026-02-23 08:52:59.017427638 +0000 UTC m=+0.155205551 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
build-date=2026-01-12T22:34:43Z, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:52:59 localhost podman[103218]: 2026-02-23 08:52:59.018279754 +0000 UTC m=+0.154077396 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., version=17.1.13, 
com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:52:59 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:52:59 localhost podman[103229]: 2026-02-23 08:52:59.075378595 +0000 UTC m=+0.196790716 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Feb 23 03:52:59 localhost podman[103220]: 2026-02-23 08:52:59.123448049 +0000 UTC m=+0.250426390 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, maintainer=OpenStack 
TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron) Feb 23 03:52:59 localhost podman[103220]: 2026-02-23 08:52:59.138378978 
+0000 UTC m=+0.265357369 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, architecture=x86_64) Feb 23 03:52:59 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:52:59 localhost podman[103217]: 2026-02-23 08:52:59.153751999 +0000 UTC m=+0.291529932 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, version=17.1.13, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team) Feb 23 03:52:59 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:52:59 localhost podman[103229]: 2026-02-23 08:52:59.180618763 +0000 UTC m=+0.302030834 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, batch=17.1_20260112.1, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z) Feb 23 03:52:59 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:52:59 localhost podman[103219]: 2026-02-23 08:52:59.215997468 +0000 UTC m=+0.345889939 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, 
distribution-scope=public, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4) Feb 23 03:52:59 localhost podman[103219]: 2026-02-23 08:52:59.276436882 +0000 UTC m=+0.406329343 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container) Feb 23 03:52:59 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:52:59 localhost systemd[1]: tmp-crun.z7zB3M.mount: Deactivated successfully. Feb 23 03:53:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:53:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:53:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:53:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:53:00 localhost systemd[1]: tmp-crun.6q3YG5.mount: Deactivated successfully. Feb 23 03:53:00 localhost podman[103333]: 2026-02-23 08:53:00.972005832 +0000 UTC m=+0.143530082 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:53:00 localhost systemd[1]: tmp-crun.aD7s0X.mount: Deactivated successfully. 
Feb 23 03:53:01 localhost podman[103335]: 2026-02-23 08:53:00.997257777 +0000 UTC m=+0.161416231 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:53:01 localhost podman[103334]: 2026-02-23 08:53:01.038308576 +0000 UTC m=+0.208496876 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:53:01 localhost podman[103334]: 2026-02-23 08:53:01.078021564 +0000 UTC m=+0.248209904 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Feb 23 03:53:01 localhost podman[103334]: unhealthy Feb 23 03:53:01 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:53:01 localhost systemd[1]: 
1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:53:01 localhost podman[103336]: 2026-02-23 08:53:01.079487678 +0000 UTC m=+0.240828656 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 23 03:53:01 localhost podman[103335]: 2026-02-23 08:53:01.135527908 +0000 UTC m=+0.299686282 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true) Feb 23 03:53:01 localhost podman[103335]: unhealthy Feb 23 03:53:01 localhost systemd[1]: 
9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:53:01 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:53:01 localhost podman[103336]: 2026-02-23 08:53:01.272346143 +0000 UTC m=+0.433687101 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z) Feb 23 03:53:01 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:53:01 localhost podman[103333]: 2026-02-23 08:53:01.362298762 +0000 UTC m=+0.533822972 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container) Feb 23 03:53:01 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:53:05 localhost systemd[1]: tmp-crun.EyVt2I.mount: Deactivated successfully. Feb 23 03:53:05 localhost podman[103519]: 2026-02-23 08:53:05.500062032 +0000 UTC m=+0.103264997 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True) Feb 23 03:53:05 localhost podman[103519]: 2026-02-23 08:53:05.60431908 +0000 UTC m=+0.207522015 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, version=7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Feb 23 03:53:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP 
SPT=5668 DPT=46644 SEQ=0 ACK=4038960630 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:53:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:53:20 localhost podman[103664]: 2026-02-23 08:53:20.92775361 +0000 UTC m=+0.096605274 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64) Feb 23 03:53:20 localhost podman[103664]: 2026-02-23 08:53:20.941051097 +0000 UTC m=+0.109902761 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5) Feb 23 03:53:20 localhost systemd[1]: 
186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:53:24 localhost sshd[103685]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:53:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:53:29 localhost systemd[1]: tmp-crun.QhMp8p.mount: Deactivated successfully. Feb 23 03:53:29 localhost systemd[1]: tmp-crun.r0vSpG.mount: Deactivated successfully. 
Feb 23 03:53:29 localhost podman[103688]: 2026-02-23 08:53:29.932733232 +0000 UTC m=+0.100164253 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, architecture=x86_64) Feb 23 03:53:29 localhost podman[103696]: 2026-02-23 08:53:29.942791951 +0000 UTC m=+0.100616927 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true) Feb 23 03:53:29 localhost podman[103688]: 2026-02-23 08:53:29.957182642 +0000 UTC m=+0.124613603 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, 
tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Feb 23 03:53:29 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:53:29 localhost podman[103696]: 2026-02-23 08:53:29.9874356 +0000 UTC m=+0.145260596 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1) Feb 23 03:53:29 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:53:30 localhost podman[103687]: 2026-02-23 08:53:29.911905333 +0000 UTC m=+0.083022767 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.41.5) Feb 23 03:53:30 localhost podman[103690]: 2026-02-23 08:53:30.02362395 +0000 UTC m=+0.182897281 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 
cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=) Feb 23 03:53:30 localhost podman[103687]: 2026-02-23 08:53:30.044249922 +0000 UTC m=+0.215367376 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.13) Feb 23 03:53:30 localhost systemd[1]: 
40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:53:30 localhost podman[103689]: 2026-02-23 08:53:30.087449316 +0000 UTC m=+0.250628907 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4) Feb 23 03:53:30 localhost podman[103690]: 2026-02-23 08:53:30.109646047 +0000 UTC m=+0.268919438 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:53:30 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:53:30 localhost podman[103689]: 2026-02-23 08:53:30.137725468 +0000 UTC m=+0.300905059 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5) Feb 23 03:53:30 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:53:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:53:31 localhost podman[103803]: 2026-02-23 08:53:31.932883283 +0000 UTC m=+0.095083857 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:53:31 localhost podman[103803]: 2026-02-23 08:53:31.944712266 +0000 UTC m=+0.106912910 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 23 03:53:31 localhost podman[103803]: unhealthy Feb 23 03:53:31 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:53:31 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:53:31 localhost podman[103802]: 2026-02-23 08:53:31.993182123 +0000 UTC m=+0.154921403 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
tcib_managed=true, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:53:32 localhost podman[103802]: 2026-02-23 08:53:32.005795369 +0000 UTC m=+0.167534609 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Feb 23 03:53:32 localhost podman[103802]: unhealthy Feb 23 03:53:32 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:53:32 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:53:32 localhost podman[103801]: 2026-02-23 08:53:32.149224028 +0000 UTC m=+0.318101016 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1) Feb 23 03:53:32 localhost podman[103805]: 2026-02-23 08:53:32.064808679 +0000 UTC m=+0.219956416 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git) Feb 23 03:53:32 localhost podman[103805]: 2026-02-23 08:53:32.262374769 +0000 UTC m=+0.417522576 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, config_id=tripleo_step1) Feb 23 03:53:32 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:53:32 localhost podman[103801]: 2026-02-23 08:53:32.538418154 +0000 UTC m=+0.707295172 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.) Feb 23 03:53:32 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:53:32 localhost systemd[1]: tmp-crun.Dd0QJg.mount: Deactivated successfully. Feb 23 03:53:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:19:01:95 MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.104 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=5668 DPT=57016 SEQ=0 ACK=1742402586 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:53:41 localhost sshd[103893]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:53:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:53:51 localhost systemd[1]: tmp-crun.yPevok.mount: Deactivated successfully. 
Feb 23 03:53:51 localhost podman[103895]: 2026-02-23 08:53:51.919358553 +0000 UTC m=+0.094907712 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., 
distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64) Feb 23 03:53:51 localhost podman[103895]: 2026-02-23 08:53:51.934245449 +0000 UTC m=+0.109794648 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:53:51 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:53:57 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 23 03:53:57 localhost recover_tripleo_nova_virtqemud[103917]: 61982 Feb 23 03:53:57 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:53:57 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:54:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:54:00 localhost systemd[1]: tmp-crun.xYSrKE.mount: Deactivated successfully. 
Feb 23 03:54:00 localhost podman[103918]: 2026-02-23 08:54:00.92618063 +0000 UTC m=+0.099076020 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3) Feb 23 03:54:00 localhost podman[103918]: 2026-02-23 08:54:00.933910077 +0000 UTC m=+0.106805417 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, container_name=iscsid, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) 
Feb 23 03:54:00 localhost podman[103919]: 2026-02-23 08:54:00.97836812 +0000 UTC m=+0.149110374 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1) Feb 23 03:54:01 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:54:01 localhost podman[103932]: 2026-02-23 08:54:01.026379723 +0000 UTC m=+0.183788307 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:54:01 localhost podman[103920]: 2026-02-23 08:54:00.954258711 +0000 UTC m=+0.118625469 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:54:01 localhost podman[103932]: 2026-02-23 08:54:01.494109627 +0000 UTC m=+0.651518271 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, 
com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step5, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:54:01 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:54:01 localhost podman[103919]: 2026-02-23 08:54:01.541669456 +0000 UTC m=+0.712411700 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container) Feb 23 03:54:01 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:54:01 localhost podman[103921]: 2026-02-23 08:54:01.629546362 +0000 UTC m=+0.794021114 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red 
Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond) Feb 23 03:54:01 localhost podman[103921]: 2026-02-23 08:54:01.639669331 +0000 UTC m=+0.804144123 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5) Feb 23 03:54:01 localhost podman[103920]: 2026-02-23 08:54:01.647255844 +0000 UTC m=+0.811622632 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, 
cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:54:01 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:54:01 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:54:01 localhost systemd[1]: tmp-crun.KPbhum.mount: Deactivated successfully. Feb 23 03:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:54:03 localhost podman[104035]: 2026-02-23 08:54:03.059067063 +0000 UTC m=+0.066387947 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:54:03 localhost podman[104037]: 2026-02-23 08:54:03.041925857 +0000 UTC m=+0.051855021 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5) Feb 23 03:54:03 localhost podman[104036]: 2026-02-23 08:54:03.093262382 +0000 UTC m=+0.103293139 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 23 03:54:03 localhost podman[104037]: 2026-02-23 08:54:03.120143706 +0000 UTC m=+0.130072850 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:54:03 localhost podman[104037]: unhealthy Feb 23 03:54:03 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:54:03 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:54:03 localhost podman[104036]: 2026-02-23 08:54:03.176195135 +0000 UTC m=+0.186225902 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:54:03 localhost podman[104036]: unhealthy Feb 23 03:54:03 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:54:03 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:54:03 localhost podman[104038]: 2026-02-23 08:54:03.219037179 +0000 UTC m=+0.222757802 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Feb 23 03:54:03 localhost podman[104038]: 2026-02-23 08:54:03.386206944 +0000 UTC m=+0.389927567 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, 
architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc.) Feb 23 03:54:03 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:54:03 localhost podman[104035]: 2026-02-23 08:54:03.423141628 +0000 UTC m=+0.430462512 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:54:03 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:54:09 localhost sshd[104126]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:54:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:54:22 localhost podman[104206]: 2026-02-23 08:54:22.910100246 +0000 UTC m=+0.084617055 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, container_name=collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:54:22 localhost podman[104206]: 2026-02-23 08:54:22.924555008 +0000 UTC m=+0.099071807 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:54:22 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:54:29 localhost sshd[104226]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:54:31 localhost systemd[1]: tmp-crun.TAUEkQ.mount: Deactivated successfully. Feb 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:54:31 localhost podman[104228]: 2026-02-23 08:54:31.684286108 +0000 UTC m=+0.144593189 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:54:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 03:54:31 localhost podman[104229]: 2026-02-23 08:54:31.655737676 +0000 UTC m=+0.114550860 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, vcs-type=git, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5) Feb 23 03:54:31 localhost podman[104228]: 2026-02-23 08:54:31.72085824 +0000 UTC m=+0.181165321 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat 
OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:54:31 localhost 
systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:54:31 localhost podman[104259]: 2026-02-23 08:54:31.774961152 +0000 UTC m=+0.106982995 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Feb 23 03:54:31 localhost podman[104279]: 2026-02-23 08:54:31.790773335 +0000 UTC m=+0.085828812 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:54:31 localhost podman[104229]: 2026-02-23 08:54:31.794072758 +0000 UTC m=+0.252885962 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.openshift.expose-services=, container_name=nova_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510) Feb 23 03:54:31 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:54:31 localhost podman[104279]: 2026-02-23 08:54:31.821678781 +0000 UTC m=+0.116737158 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1) Feb 23 03:54:31 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:54:31 localhost podman[104259]: 2026-02-23 08:54:31.833114608 +0000 UTC m=+0.165136391 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible) Feb 23 03:54:31 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:54:31 localhost podman[104280]: 2026-02-23 08:54:31.878004121 +0000 UTC m=+0.172714757 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, release=1766032510, vcs-type=git, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., 
summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 03:54:31 localhost podman[104280]: 2026-02-23 08:54:31.888283262 +0000 UTC m=+0.182993848 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, container_name=logrotate_crond, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true) Feb 23 03:54:31 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:54:33 localhost podman[104347]: 2026-02-23 08:54:33.930343969 +0000 UTC m=+0.099697116 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T22:56:19Z, 
container_name=ovn_metadata_agent, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:54:33 localhost systemd[1]: tmp-crun.Dx24rU.mount: Deactivated successfully. 
Feb 23 03:54:33 localhost podman[104348]: 2026-02-23 08:54:33.98127204 +0000 UTC m=+0.145765795 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13) Feb 23 03:54:34 localhost podman[104347]: 2026-02-23 08:54:34.063624394 +0000 UTC m=+0.232977531 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, vcs-type=git) Feb 23 03:54:34 localhost podman[104347]: unhealthy Feb 23 03:54:34 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:54:34 
localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:54:34 localhost podman[104346]: 2026-02-23 08:54:34.079690895 +0000 UTC m=+0.250114236 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:54:34 localhost podman[104345]: 2026-02-23 08:54:34.035796715 +0000 UTC m=+0.208938660 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:54:34 localhost podman[104346]: 2026-02-23 08:54:34.129579484 +0000 UTC m=+0.300002825 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, release=1766032510, version=17.1.13, batch=17.1_20260112.1) Feb 23 03:54:34 localhost podman[104346]: unhealthy Feb 23 03:54:34 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:54:34 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:54:34 localhost podman[104348]: 2026-02-23 08:54:34.185098469 +0000 UTC m=+0.349592204 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:54:34 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:54:34 localhost podman[104345]: 2026-02-23 08:54:34.418548004 +0000 UTC m=+0.591689919 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:54:34 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:54:34 localhost systemd[1]: tmp-crun.PnlzZf.mount: Deactivated successfully. Feb 23 03:54:42 localhost sshd[104431]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:54:53 localhost systemd[1]: tmp-crun.JhWjW3.mount: Deactivated successfully. Feb 23 03:54:53 localhost podman[104433]: 2026-02-23 08:54:53.927620494 +0000 UTC m=+0.098223301 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true) Feb 23 03:54:53 localhost podman[104433]: 2026-02-23 08:54:53.96366932 +0000 UTC m=+0.134272127 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:54:53 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: 
Deactivated successfully. Feb 23 03:55:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:55:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:55:01 localhost podman[104454]: 2026-02-23 08:55:01.910157671 +0000 UTC m=+0.078592537 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container) Feb 23 03:55:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:55:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 03:55:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:55:01 localhost podman[104454]: 2026-02-23 08:55:01.946657561 +0000 UTC m=+0.115092377 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step5) Feb 23 03:55:01 localhost podman[104453]: 2026-02-23 08:55:01.95780281 +0000 UTC m=+0.133472242 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, release=1766032510, com.redhat.component=openstack-iscsid-container, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5) Feb 23 03:55:01 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:55:01 localhost podman[104453]: 2026-02-23 08:55:01.996131056 +0000 UTC m=+0.171800458 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510) Feb 23 03:55:02 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:55:02 localhost podman[104487]: 2026-02-23 08:55:02.02149374 +0000 UTC m=+0.093761221 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z) Feb 23 03:55:02 localhost podman[104489]: 2026-02-23 08:55:02.07302326 +0000 UTC m=+0.138175508 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:55:02 localhost podman[104489]: 2026-02-23 08:55:02.087234534 +0000 UTC m=+0.152386712 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO 
Team, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Feb 23 03:55:02 localhost systemd[1]: 
b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:55:02 localhost podman[104488]: 2026-02-23 08:55:02.135684717 +0000 UTC m=+0.202716224 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:55:02 localhost podman[104487]: 2026-02-23 08:55:02.153475344 +0000 UTC m=+0.225742855 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, 
container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:55:02 localhost podman[104488]: 2026-02-23 08:55:02.167663157 +0000 UTC m=+0.234694704 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:55:02 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:55:02 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:55:02 localhost sshd[104571]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:55:02 localhost systemd[1]: tmp-crun.mjc5Nk.mount: Deactivated successfully. Feb 23 03:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:55:04 localhost podman[104573]: 2026-02-23 08:55:04.214029939 +0000 UTC m=+0.082605602 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 03:55:04 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:55:04 localhost podman[104573]: 2026-02-23 08:55:04.263377561 +0000 UTC m=+0.131953224 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64) Feb 23 03:55:04 localhost podman[104573]: unhealthy Feb 23 03:55:04 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:55:04 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:55:04 localhost systemd[1]: tmp-crun.vNNAAI.mount: Deactivated successfully. 
Feb 23 03:55:04 localhost podman[104596]: 2026-02-23 08:55:04.357972957 +0000 UTC m=+0.119970970 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 23 03:55:04 localhost podman[104595]: 2026-02-23 08:55:04.32830595 +0000 UTC m=+0.089723945 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13) Feb 23 03:55:04 localhost podman[104595]: 2026-02-23 08:55:04.412406427 +0000 UTC m=+0.173824442 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=) Feb 23 03:55:04 localhost podman[104595]: unhealthy Feb 23 03:55:04 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:55:04 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:55:04 localhost podman[104596]: 2026-02-23 08:55:04.558261295 +0000 UTC m=+0.320259278 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1) Feb 23 03:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:55:04 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:55:04 localhost podman[104642]: 2026-02-23 08:55:04.671506294 +0000 UTC m=+0.085167053 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true) Feb 23 03:55:05 localhost podman[104642]: 2026-02-23 08:55:05.010198716 +0000 UTC m=+0.423859415 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64)
Feb 23 03:55:05 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully.
Feb 23 03:55:13 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 23 03:55:14 localhost recover_tripleo_nova_virtqemud[104682]: 61982
Feb 23 03:55:14 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 23 03:55:14 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 23 03:55:18 localhost sshd[104746]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:55:20 localhost sshd[104748]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 03:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.
Feb 23 03:55:24 localhost podman[104750]: 2026-02-23 08:55:24.938182455 +0000 UTC m=+0.103779063 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true) Feb 23 03:55:24 localhost podman[104750]: 2026-02-23 08:55:24.977256776 +0000 UTC m=+0.142853344 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, container_name=collectd, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:55:24 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:55:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. 
Feb 23 03:55:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:55:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:55:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:55:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:55:32 localhost podman[104771]: 2026-02-23 08:55:32.936437351 +0000 UTC m=+0.099445258 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1766032510, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true) Feb 23 03:55:32 localhost podman[104771]: 2026-02-23 08:55:32.97480889 +0000 UTC m=+0.137816827 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64) Feb 23 03:55:32 localhost systemd[1]: tmp-crun.2sgnoy.mount: Deactivated successfully. Feb 23 03:55:33 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:55:33 localhost podman[104772]: 2026-02-23 08:55:33.039661557 +0000 UTC m=+0.199399072 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, release=1766032510, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z) Feb 23 03:55:33 localhost podman[104770]: 2026-02-23 08:55:33.006184741 +0000 UTC m=+0.171937343 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 03:55:33 localhost podman[104770]: 2026-02-23 08:55:33.085775597 +0000 UTC m=+0.251528169 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team) Feb 23 03:55:33 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:55:33 localhost podman[104772]: 2026-02-23 08:55:33.105786253 +0000 UTC m=+0.265523708 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Feb 23 03:55:33 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:55:33 localhost podman[104773]: 2026-02-23 08:55:33.106585368 +0000 UTC m=+0.262503614 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public) Feb 23 03:55:33 localhost podman[104773]: 2026-02-23 08:55:33.190431738 +0000 UTC m=+0.346349974 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:55:33 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:55:33 localhost podman[104782]: 2026-02-23 08:55:33.195291069 +0000 UTC m=+0.348739787 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, release=1766032510, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:55:33 localhost podman[104782]: 2026-02-23 08:55:33.275731773 +0000 UTC m=+0.429180451 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 
'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, 
vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public) Feb 23 03:55:33 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:55:33 localhost systemd[1]: tmp-crun.MssDk5.mount: Deactivated successfully. Feb 23 03:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:55:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:55:34 localhost podman[104890]: 2026-02-23 08:55:34.905333372 +0000 UTC m=+0.075942004 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=) Feb 23 03:55:34 localhost podman[104888]: 2026-02-23 08:55:34.96608569 +0000 UTC m=+0.140826732 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.13, release=1766032510, container_name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1) Feb 23 03:55:34 localhost podman[104888]: 2026-02-23 08:55:34.980370557 +0000 UTC m=+0.155111569 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=) Feb 23 03:55:34 localhost podman[104888]: unhealthy Feb 23 03:55:34 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:55:34 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:55:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:55:35 localhost podman[104889]: 2026-02-23 08:55:35.081190007 +0000 UTC m=+0.253238175 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:55:35 localhost podman[104889]: 2026-02-23 08:55:35.098274781 +0000 UTC m=+0.270323019 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com) Feb 23 03:55:35 localhost podman[104889]: unhealthy Feb 23 03:55:35 localhost podman[104890]: 2026-02-23 08:55:35.106541429 +0000 UTC m=+0.277150061 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:55:35 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:55:35 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:55:35 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:55:35 localhost podman[104954]: 2026-02-23 08:55:35.176609378 +0000 UTC m=+0.082589732 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1) Feb 23 03:55:35 localhost podman[104954]: 2026-02-23 08:55:35.573274592 +0000 UTC m=+0.479254926 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=nova_migration_target, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4) Feb 23 03:55:35 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:55:55 localhost systemd[1]: tmp-crun.xwTuyG.mount: Deactivated successfully. 
Feb 23 03:55:55 localhost podman[104979]: 2026-02-23 08:55:55.933967593 +0000 UTC m=+0.097037892 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, container_name=collectd, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:55:55 localhost podman[104979]: 2026-02-23 08:55:55.970390761 +0000 UTC m=+0.133461020 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:55:55 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:55:57 localhost sshd[105000]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:56:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:56:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:56:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:56:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:56:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:56:03 localhost podman[105003]: 2026-02-23 08:56:03.929961511 +0000 UTC m=+0.096854128 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, 
build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:56:03 localhost systemd[1]: tmp-crun.oz2MyW.mount: Deactivated successfully. 
Feb 23 03:56:04 localhost podman[105005]: 2026-02-23 08:56:03.999807333 +0000 UTC m=+0.157618347 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:56:04 localhost podman[105003]: 2026-02-23 08:56:04.015683558 +0000 UTC m=+0.182576215 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, release=1766032510, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, version=17.1.13) Feb 23 03:56:04 localhost podman[105005]: 2026-02-23 08:56:04.036386596 +0000 UTC m=+0.194197550 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack 
TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}) Feb 23 03:56:04 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:56:04 localhost podman[105011]: 2026-02-23 08:56:03.949513322 +0000 UTC m=+0.105001063 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T23:32:04Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:56:04 localhost podman[105011]: 2026-02-23 08:56:04.07974611 +0000 UTC m=+0.235233851 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510) Feb 23 03:56:04 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:56:04 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:56:04 localhost podman[105004]: 2026-02-23 08:56:04.040336129 +0000 UTC m=+0.202551260 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:56:04 localhost podman[105002]: 2026-02-23 08:56:04.147185897 +0000 UTC m=+0.316728927 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:56:04 localhost podman[105002]: 2026-02-23 08:56:04.161391951 +0000 UTC m=+0.330935021 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, 
distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:56:04 localhost 
systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:56:04 localhost podman[105004]: 2026-02-23 08:56:04.17733346 +0000 UTC m=+0.339548571 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Feb 23 03:56:04 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:56:04 localhost sshd[105118]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:56:04 localhost sshd[105120]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:56:05 localhost sshd[105122]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 03:56:05 localhost podman[105123]: 2026-02-23 08:56:05.32371223 +0000 UTC m=+0.087104763 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Feb 23 03:56:05 localhost podman[105124]: 2026-02-23 08:56:05.385287994 +0000 UTC m=+0.146296182 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Feb 23 03:56:05 localhost podman[105123]: 2026-02-23 08:56:05.395331908 +0000 UTC m=+0.158724451 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, container_name=ovn_controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64) Feb 23 03:56:05 localhost podman[105123]: unhealthy Feb 23 03:56:05 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:56:05 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: 
Failed with result 'exit-code'. Feb 23 03:56:05 localhost podman[105124]: 2026-02-23 08:56:05.421937819 +0000 UTC m=+0.182946007 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public) Feb 23 03:56:05 localhost podman[105124]: unhealthy Feb 23 03:56:05 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:56:05 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:56:05 localhost podman[105125]: 2026-02-23 08:56:05.485533566 +0000 UTC m=+0.243565731 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:56:05 localhost podman[105125]: 2026-02-23 08:56:05.686424534 +0000 UTC m=+0.444456709 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible) Feb 23 03:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:56:05 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:56:05 localhost podman[105191]: 2026-02-23 08:56:05.79608578 +0000 UTC m=+0.084197062 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, distribution-scope=public) Feb 23 03:56:06 localhost podman[105191]: 2026-02-23 08:56:06.243238442 +0000 UTC m=+0.531349714 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 03:56:06 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:56:26 localhost kernel: DROPPING: IN=vlan20 OUT= MACSRC=8a:7d:df:61:60:7d MACDST=f6:9b:d3:a0:7a:ad MACPROTO=0800 SRC=172.17.0.104 DST=172.17.0.106 LEN=40 TOS=0x00 PREC=0xC0 TTL=64 ID=0 DF PROTO=TCP SPT=6642 DPT=40168 SEQ=0 ACK=3692768849 WINDOW=0 RES=0x00 ACK RST URGP=0 Feb 23 03:56:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:56:26 localhost podman[105291]: 2026-02-23 08:56:26.921929115 +0000 UTC m=+0.091632761 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, tcib_managed=true) Feb 23 03:56:26 localhost podman[105291]: 2026-02-23 08:56:26.936181786 +0000 UTC m=+0.105885472 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, container_name=collectd, release=1766032510, vcs-type=git, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, 
architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc.) Feb 23 03:56:26 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:56:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:56:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:56:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:56:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:56:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:56:34 localhost podman[105312]: 2026-02-23 08:56:34.93317931 +0000 UTC m=+0.099769723 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:56:34 localhost podman[105313]: 2026-02-23 08:56:34.978178364 +0000 UTC m=+0.140652858 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:56:35 localhost podman[105311]: 2026-02-23 08:56:35.028858674 +0000 UTC m=+0.197565322 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, 
batch=17.1_20260112.1, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 23 03:56:35 localhost podman[105313]: 2026-02-23 08:56:35.035304454 +0000 UTC m=+0.197778908 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64) Feb 23 03:56:35 localhost podman[105311]: 2026-02-23 08:56:35.042288091 +0000 UTC m=+0.210994709 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, container_name=iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team) Feb 23 03:56:35 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:56:35 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:56:35 localhost podman[105312]: 2026-02-23 08:56:35.09617033 +0000 UTC m=+0.262760743 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4) Feb 23 03:56:35 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:56:35 localhost podman[105327]: 2026-02-23 08:56:35.108502092 +0000 UTC m=+0.260495432 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step5, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:56:35 localhost podman[105327]: 2026-02-23 08:56:35.18431431 +0000 UTC m=+0.336307650 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, config_id=tripleo_step5, container_name=nova_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red 
Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z) Feb 23 03:56:35 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:56:35 localhost podman[105314]: 2026-02-23 08:56:35.198928664 +0000 UTC m=+0.354824205 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com) Feb 23 03:56:35 localhost podman[105314]: 2026-02-23 08:56:35.212150993 +0000 UTC m=+0.368046534 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:56:35 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:56:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:56:35 localhost podman[105432]: 2026-02-23 08:56:35.912282135 +0000 UTC m=+0.082461115 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, container_name=metrics_qdr, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:56:35 localhost systemd[1]: tmp-crun.nofmvD.mount: Deactivated successfully. 
Feb 23 03:56:35 localhost podman[105431]: 2026-02-23 08:56:35.973990127 +0000 UTC m=+0.146440028 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:56:36 localhost podman[105431]: 2026-02-23 08:56:36.021162269 +0000 UTC m=+0.193612140 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13) Feb 23 03:56:36 localhost podman[105431]: unhealthy Feb 23 03:56:36 localhost podman[105430]: 2026-02-23 08:56:36.028841027 +0000 UTC m=+0.203918599 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:56:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:56:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:56:36 localhost podman[105430]: 2026-02-23 08:56:36.073285364 +0000 UTC m=+0.248362906 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, container_name=ovn_controller, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:56:36 localhost podman[105430]: unhealthy Feb 23 03:56:36 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:56:36 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:56:36 localhost sshd[105496]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:56:36 localhost podman[105432]: 2026-02-23 08:56:36.116306157 +0000 UTC m=+0.286485177 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=metrics_qdr, vcs-type=git, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 23 03:56:36 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 03:56:36 localhost podman[105498]: 2026-02-23 08:56:36.727888905 +0000 UTC m=+0.087273145 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:56:37 localhost podman[105498]: 2026-02-23 08:56:37.115536536 +0000 UTC m=+0.474920786 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git) Feb 23 03:56:37 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:56:48 localhost sshd[105521]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:56:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 03:56:57 localhost podman[105523]: 2026-02-23 08:56:57.942616036 +0000 UTC m=+0.085436998 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, container_name=collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=) Feb 23 03:56:57 localhost podman[105523]: 2026-02-23 08:56:57.95046562 +0000 UTC m=+0.093286602 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-type=git, version=17.1.13, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:56:57 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:57:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:57:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:57:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:57:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:57:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:57:05 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:57:05 localhost recover_tripleo_nova_virtqemud[105575]: 61982 Feb 23 03:57:05 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:57:05 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:57:05 localhost systemd[1]: tmp-crun.GC6hPE.mount: Deactivated successfully. 
Feb 23 03:57:05 localhost podman[105543]: 2026-02-23 08:57:05.920044282 +0000 UTC m=+0.083893610 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:57:05 localhost podman[105551]: 2026-02-23 08:57:05.99195034 +0000 UTC m=+0.144571880 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vcs-type=git, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=) Feb 23 03:57:06 localhost podman[105543]: 2026-02-23 08:57:06.0012923 +0000 UTC m=+0.165141638 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, container_name=ceilometer_agent_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, config_id=tripleo_step4, managed_by=tripleo_ansible) Feb 23 03:57:06 localhost podman[105545]: 2026-02-23 08:57:05.961182427 +0000 UTC m=+0.118984028 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
io.buildah.version=1.41.5, container_name=logrotate_crond, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:57:06 localhost podman[105544]: 2026-02-23 08:57:06.019475132 +0000 UTC m=+0.179708328 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:57:06 localhost podman[105545]: 2026-02-23 08:57:06.041901977 +0000 UTC m=+0.199703578 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:57:06 localhost podman[105551]: 2026-02-23 08:57:06.048183263 +0000 UTC m=+0.200804773 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, 
konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=) Feb 23 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:57:06 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:57:06 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. Feb 23 03:57:06 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:57:06 localhost podman[105542]: 2026-02-23 08:57:06.090940447 +0000 UTC m=+0.258619323 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:57:06 localhost podman[105544]: 2026-02-23 08:57:06.094982582 +0000 UTC m=+0.255215828 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc.) Feb 23 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:57:06 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:57:06 localhost podman[105542]: 2026-02-23 08:57:06.151810833 +0000 UTC m=+0.319489719 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z) Feb 23 03:57:06 localhost podman[105649]: 2026-02-23 08:57:06.162812844 +0000 UTC m=+0.092957242 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:57:06 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:57:06 localhost podman[105649]: 2026-02-23 08:57:06.181177833 +0000 UTC m=+0.111322201 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true) Feb 23 03:57:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:57:06 localhost podman[105649]: unhealthy Feb 23 03:57:06 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:57:06 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:57:06 localhost podman[105669]: 2026-02-23 08:57:06.247524049 +0000 UTC m=+0.129297467 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:57:06 localhost podman[105669]: 2026-02-23 08:57:06.26661345 +0000 UTC m=+0.148386858 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller) Feb 23 03:57:06 localhost podman[105669]: unhealthy Feb 23 03:57:06 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:57:06 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:57:06 localhost podman[105689]: 2026-02-23 08:57:06.342183552 +0000 UTC m=+0.143934171 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:57:06 localhost podman[105689]: 2026-02-23 08:57:06.573469147 +0000 UTC m=+0.375219806 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:57:06 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:57:06 localhost systemd[1]: tmp-crun.v8tlsG.mount: Deactivated successfully. Feb 23 03:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:57:07 localhost podman[105728]: 2026-02-23 08:57:07.938095217 +0000 UTC m=+0.108559253 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:57:08 localhost podman[105728]: 2026-02-23 08:57:08.354564671 +0000 UTC m=+0.525028657 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute) Feb 23 03:57:08 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:57:13 localhost sshd[105751]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:57:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:57:28 localhost podman[105829]: 2026-02-23 08:57:28.924079807 +0000 UTC m=+0.096127610 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 23 03:57:28 localhost podman[105829]: 2026-02-23 08:57:28.966316245 +0000 UTC m=+0.138363998 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, 
name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5) Feb 23 03:57:28 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:57:31 localhost sshd[105849]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:57:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:57:36 localhost systemd[1]: tmp-crun.fiXAnD.mount: Deactivated successfully. 
Feb 23 03:57:36 localhost podman[105852]: 2026-02-23 08:57:36.934510365 +0000 UTC m=+0.101080152 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5) Feb 23 03:57:36 localhost systemd[1]: tmp-crun.Ur6jRp.mount: Deactivated successfully. Feb 23 03:57:36 localhost podman[105866]: 2026-02-23 08:57:36.990378226 +0000 UTC m=+0.139897275 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, container_name=logrotate_crond, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, 
release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public) Feb 23 03:57:37 localhost podman[105856]: 2026-02-23 08:57:37.010239352 +0000 UTC m=+0.156823740 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Feb 23 03:57:37 localhost podman[105860]: 2026-02-23 08:57:36.964696831 +0000 UTC m=+0.117429210 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13) Feb 23 03:57:37 localhost podman[105852]: 2026-02-23 08:57:37.018393124 +0000 UTC m=+0.184962941 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:57:37 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:57:37 localhost podman[105851]: 2026-02-23 08:57:36.981275185 +0000 UTC m=+0.152571109 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Feb 23 03:57:37 localhost podman[105860]: 2026-02-23 08:57:37.04373863 +0000 UTC m=+0.196471089 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 03:57:37 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:57:37 localhost podman[105851]: 2026-02-23 08:57:37.060247311 +0000 UTC m=+0.231543225 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1) Feb 23 03:57:37 localhost podman[105851]: unhealthy Feb 23 03:57:37 localhost podman[105866]: 2026-02-23 08:57:37.073551754 +0000 UTC m=+0.223070803 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public) Feb 23 03:57:37 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:57:37 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:57:37 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. 
Feb 23 03:57:37 localhost podman[105856]: 2026-02-23 08:57:37.098419094 +0000 UTC m=+0.245003532 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f) Feb 23 03:57:37 localhost podman[105856]: unhealthy Feb 23 03:57:37 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:57:37 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:57:37 localhost podman[105873]: 2026-02-23 08:57:37.119842737 +0000 UTC m=+0.252710520 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=) Feb 23 03:57:37 localhost podman[105872]: 2026-02-23 08:57:37.142468618 +0000 UTC m=+0.294578357 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:57:37 localhost podman[105872]: 2026-02-23 08:57:37.201222459 +0000 UTC m=+0.353332248 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=nova_compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com) Feb 23 03:57:37 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Deactivated successfully. 
Feb 23 03:57:37 localhost podman[105853]: 2026-02-23 08:57:37.203795169 +0000 UTC m=+0.367737485 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.) Feb 23 03:57:37 localhost podman[105853]: 2026-02-23 08:57:37.282495867 +0000 UTC m=+0.446438153 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:57:37 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:57:37 localhost podman[105873]: 2026-02-23 08:57:37.361237746 +0000 UTC m=+0.494105609 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=metrics_qdr, release=1766032510, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee) Feb 23 03:57:37 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:57:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:57:38 localhost podman[106029]: 2026-02-23 08:57:38.906483123 +0000 UTC m=+0.082293570 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO 
Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, build-date=2026-01-12T23:32:04Z, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 03:57:39 localhost podman[106029]: 2026-02-23 08:57:39.312833993 +0000 UTC m=+0.488644400 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, 
vcs-type=git, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Feb 23 03:57:39 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:57:52 localhost sshd[106053]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:57:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:57:59 localhost podman[106055]: 2026-02-23 08:57:59.928612145 +0000 UTC m=+0.097602145 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc.) 
Feb 23 03:57:59 localhost podman[106055]: 2026-02-23 08:57:59.941105172 +0000 UTC m=+0.110095152 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 03:57:59 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 03:58:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:58:07 localhost podman[106084]: 2026-02-23 08:58:07.951362475 +0000 UTC m=+0.108814622 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi) Feb 23 03:58:07 localhost podman[106076]: 2026-02-23 08:58:07.932826801 +0000 UTC m=+0.104117537 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64) Feb 23 03:58:07 localhost podman[106077]: 2026-02-23 08:58:07.995489223 +0000 UTC m=+0.161828735 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 03:58:08 localhost podman[106084]: 2026-02-23 08:58:08.004618515 +0000 UTC m=+0.162070652 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:58:08 localhost podman[106076]: 2026-02-23 08:58:08.011889551 +0000 UTC m=+0.183180267 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Feb 23 03:58:08 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:58:08 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:58:08 localhost podman[106090]: 2026-02-23 08:58:08.073230211 +0000 UTC m=+0.206275812 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, version=17.1.13, config_id=tripleo_step4) Feb 23 03:58:08 localhost podman[106078]: 2026-02-23 08:58:08.084073527 +0000 UTC m=+0.245045793 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:58:08 localhost podman[106090]: 2026-02-23 08:58:08.088192695 +0000 UTC m=+0.221238256 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 
(image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:58:08 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:58:08 localhost podman[106078]: 2026-02-23 08:58:08.12935064 +0000 UTC m=+0.290322936 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:58:08 localhost podman[106078]: unhealthy Feb 23 03:58:08 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:58:08 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 
'exit-code'. Feb 23 03:58:08 localhost podman[106075]: 2026-02-23 08:58:08.146807411 +0000 UTC m=+0.320255244 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 23 03:58:08 localhost podman[106075]: 2026-02-23 08:58:08.189435672 +0000 UTC m=+0.362883505 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64) Feb 23 03:58:08 localhost podman[106075]: unhealthy Feb 23 03:58:08 localhost podman[106098]: 2026-02-23 08:58:08.19908506 +0000 UTC m=+0.342259905 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Feb 23 03:58:08 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:58:08 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:58:08 localhost podman[106096]: 2026-02-23 08:58:08.247821401 +0000 UTC m=+0.399889721 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 03:58:08 localhost podman[106077]: 2026-02-23 08:58:08.276747347 +0000 UTC m=+0.443086849 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:58:08 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. Feb 23 03:58:08 localhost podman[106096]: 2026-02-23 08:58:08.297351155 +0000 UTC m=+0.449419505 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true) Feb 23 03:58:08 localhost podman[106096]: unhealthy Feb 23 03:58:08 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:58:08 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed 
with result 'exit-code'. Feb 23 03:58:08 localhost podman[106098]: 2026-02-23 08:58:08.414426742 +0000 UTC m=+0.557601617 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13) Feb 23 03:58:08 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:58:09 localhost podman[106256]: 2026-02-23 08:58:09.911529738 +0000 UTC m=+0.085566993 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4) Feb 23 03:58:10 localhost podman[106256]: 2026-02-23 08:58:10.329451026 +0000 UTC m=+0.503488221 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 
17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, tcib_managed=true) Feb 23 03:58:10 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:58:17 localhost sshd[106281]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:58:27 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:58:27 localhost recover_tripleo_nova_virtqemud[106360]: 61982 Feb 23 03:58:27 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:58:27 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 03:58:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62259 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD814550000000001030307) Feb 23 03:58:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:58:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62260 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD818470000000001030307) Feb 23 03:58:30 localhost systemd[1]: tmp-crun.tDBFJU.mount: Deactivated successfully. 
Feb 23 03:58:30 localhost podman[106361]: 2026-02-23 08:58:30.934922291 +0000 UTC m=+0.108981867 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5) Feb 23 03:58:30 localhost podman[106361]: 2026-02-23 08:58:30.94832204 +0000 UTC m=+0.122381606 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Feb 23 03:58:30 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:58:32 localhost sshd[106381]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:58:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62261 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD820460000000001030307) Feb 23 03:58:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=428 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8213B0000000001030307) Feb 23 03:58:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=429 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD825460000000001030307) Feb 23 03:58:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=430 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD82D460000000001030307) Feb 23 03:58:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62262 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD830070000000001030307) Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:58:38 localhost systemd[1]: tmp-crun.6daTq0.mount: Deactivated successfully. 
Feb 23 03:58:38 localhost podman[106384]: 2026-02-23 08:58:38.965639034 +0000 UTC m=+0.129440785 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 23 03:58:39 localhost systemd[1]: tmp-crun.KAmkbo.mount: Deactivated successfully. Feb 23 03:58:39 localhost podman[106405]: 2026-02-23 08:58:39.011669492 +0000 UTC m=+0.150003957 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step5, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public) Feb 23 03:58:39 localhost podman[106383]: 2026-02-23 08:58:39.035948201 +0000 UTC m=+0.200005180 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, container_name=ovn_controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:58:39 localhost podman[106395]: 2026-02-23 08:58:39.052925932 +0000 UTC m=+0.191222066 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, architecture=x86_64, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team) Feb 23 03:58:39 localhost podman[106386]: 2026-02-23 08:58:39.096993869 +0000 UTC m=+0.249433115 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:58:39 localhost podman[106395]: 2026-02-23 08:58:39.09960138 +0000 UTC m=+0.237897484 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc.) Feb 23 03:58:39 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. 
Feb 23 03:58:39 localhost podman[106386]: 2026-02-23 08:58:39.114219367 +0000 UTC m=+0.266658623 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, version=17.1.13, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true) Feb 23 03:58:39 localhost podman[106386]: unhealthy Feb 23 03:58:39 localhost podman[106405]: 2026-02-23 08:58:39.125898912 +0000 UTC m=+0.264233397 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, 
tcib_managed=true, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container) Feb 23 03:58:39 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:58:39 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:58:39 localhost podman[106405]: unhealthy Feb 23 03:58:39 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:58:39 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. 
Feb 23 03:58:39 localhost podman[106404]: 2026-02-23 08:58:39.103976116 +0000 UTC m=+0.247531055 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, 
build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:58:39 localhost podman[106383]: 2026-02-23 08:58:39.169603388 +0000 UTC m=+0.333660427 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, architecture=x86_64, release=1766032510, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, version=17.1.13) Feb 23 03:58:39 localhost podman[106383]: unhealthy Feb 23 03:58:39 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:58:39 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:58:39 localhost podman[106385]: 2026-02-23 08:58:39.21447974 +0000 UTC m=+0.376299000 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z) Feb 23 03:58:39 localhost podman[106384]: 2026-02-23 08:58:39.225414052 +0000 UTC m=+0.389215823 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, release=1766032510) Feb 23 03:58:39 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:58:39 localhost podman[106404]: 2026-02-23 08:58:39.235323721 +0000 UTC m=+0.378878630 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:58:39 localhost podman[106385]: 2026-02-23 08:58:39.25032811 +0000 UTC m=+0.412147380 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 03:58:39 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:58:39 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:58:39 localhost podman[106411]: 2026-02-23 08:58:39.309146628 +0000 UTC m=+0.442842878 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, release=1766032510, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr) Feb 23 03:58:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9319 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD839920000000001030307) Feb 23 03:58:39 localhost podman[106411]: 2026-02-23 08:58:39.529338588 +0000 UTC m=+0.663034858 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 03:58:39 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:58:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=431 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD83D060000000001030307) Feb 23 03:58:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9320 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD83D860000000001030307) Feb 23 03:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:58:40 localhost podman[106557]: 2026-02-23 08:58:40.913091515 +0000 UTC m=+0.083106768 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:58:41 localhost podman[106557]: 2026-02-23 08:58:41.330822398 +0000 UTC m=+0.500837661 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack 
Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team) Feb 23 03:58:41 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:58:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9321 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD845860000000001030307) Feb 23 03:58:43 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2999 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD849300000000001030307) Feb 23 03:58:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3000 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD84D460000000001030307) Feb 23 03:58:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62263 DF PROTO=TCP SPT=41348 DPT=9100 SEQ=1915400411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD850060000000001030307) Feb 23 03:58:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9322 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD855470000000001030307) Feb 23 03:58:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3001 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD855470000000001030307) Feb 23 03:58:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=432 DF PROTO=TCP SPT=53014 DPT=9101 SEQ=3564720691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD85E060000000001030307) Feb 23 03:58:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3002 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD865060000000001030307) Feb 23 03:58:53 localhost sshd[106581]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:58:53 localhost systemd-logind[759]: New session 36 of user zuul. Feb 23 03:58:53 localhost systemd[1]: Started Session 36 of User zuul. 
Feb 23 03:58:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27699 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=3891827151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8730F0000000001030307) Feb 23 03:58:54 localhost python3.9[106676]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 03:58:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9323 DF PROTO=TCP SPT=43354 DPT=9882 SEQ=1288668895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD876060000000001030307) Feb 23 03:58:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27700 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=3891827151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD877060000000001030307) Feb 23 03:58:55 localhost python3.9[106770]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 03:58:56 localhost python3.9[106863]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False 
checksum_algorithm=sha1 Feb 23 03:58:56 localhost python3.9[106957]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 03:58:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27701 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=3891827151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD87F060000000001030307) Feb 23 03:58:57 localhost python3.9[107050]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 03:58:58 localhost python3.9[107141]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Feb 23 03:58:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3003 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD886060000000001030307) Feb 23 03:58:59 localhost sshd[107156]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:58:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21908 DF PROTO=TCP SPT=35058 DPT=9100 SEQ=2710335491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD889850000000001030307) Feb 23 03:59:00 localhost 
python3.9[107233]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 03:59:00 localhost python3.9[107325]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Feb 23 03:59:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21909 DF PROTO=TCP SPT=35058 DPT=9100 SEQ=2710335491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD88D860000000001030307) Feb 23 03:59:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:59:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:59:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:59:01 localhost python3.9[107415]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 03:59:01 localhost systemd[1]: tmp-crun.xulUhf.mount: Deactivated successfully. 
Feb 23 03:59:01 localhost podman[107416]: 2026-02-23 08:59:01.937929132 +0000 UTC m=+0.109271096 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, architecture=x86_64, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=collectd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, vcs-type=git) Feb 23 03:59:01 localhost podman[107416]: 2026-02-23 08:59:01.953357974 +0000 UTC m=+0.124699918 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git) Feb 23 03:59:01 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 03:59:02 localhost python3.9[107483]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 03:59:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21910 DF PROTO=TCP SPT=35058 DPT=9100 SEQ=2710335491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD895860000000001030307) Feb 23 03:59:03 localhost systemd[1]: session-36.scope: Deactivated successfully. Feb 23 03:59:03 localhost systemd[1]: session-36.scope: Consumed 5.012s CPU time. Feb 23 03:59:03 localhost systemd-logind[759]: Session 36 logged out. Waiting for processes to exit. Feb 23 03:59:03 localhost systemd-logind[759]: Removed session 36. 
Feb 23 03:59:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 03:59:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 03:59:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2465 DF PROTO=TCP SPT=58868 DPT=9101 SEQ=2655477868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8A2860000000001030307) Feb 23 03:59:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20693 DF PROTO=TCP SPT=46774 DPT=9882 SEQ=522294137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8AEC30000000001030307) Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. 
Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:59:09 localhost podman[107501]: 2026-02-23 08:59:09.926308541 +0000 UTC m=+0.092877533 container health_status 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Feb 23 03:59:09 localhost systemd[1]: tmp-crun.rcgNOW.mount: Deactivated successfully. 
Feb 23 03:59:09 localhost podman[107528]: 2026-02-23 08:59:09.955140022 +0000 UTC m=+0.101449291 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd) Feb 23 03:59:09 localhost systemd[1]: tmp-crun.GeTEWM.mount: Deactivated successfully. Feb 23 03:59:09 localhost podman[107526]: 2026-02-23 08:59:09.998811317 +0000 UTC m=+0.142834664 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, 
vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Feb 23 03:59:10 localhost podman[107499]: 2026-02-23 08:59:10.040035484 +0000 UTC m=+0.208419683 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:59:10 localhost podman[107502]: 2026-02-23 08:59:09.974958311 +0000 UTC m=+0.135558956 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.openshift.expose-services=) Feb 23 03:59:10 localhost podman[107500]: 2026-02-23 08:59:10.095286511 +0000 UTC m=+0.261645936 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat 
OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, architecture=x86_64, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 03:59:10 localhost podman[107499]: 2026-02-23 08:59:10.107159342 +0000 UTC m=+0.275543601 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, 
io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 03:59:10 localhost podman[107499]: unhealthy Feb 23 03:59:10 localhost podman[107519]: 2026-02-23 08:59:10.11602495 +0000 UTC m=+0.266242741 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com) Feb 23 03:59:10 localhost podman[107526]: 2026-02-23 08:59:10.11798457 +0000 UTC m=+0.262007997 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:59:10 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:59:10 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 03:59:10 localhost podman[107526]: unhealthy Feb 23 03:59:10 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:59:10 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. 
Feb 23 03:59:10 localhost podman[107528]: 2026-02-23 08:59:10.146230773 +0000 UTC m=+0.292540072 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z) Feb 23 03:59:10 localhost podman[107500]: 2026-02-23 08:59:10.160612722 +0000 UTC m=+0.326972197 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp-rhel9/openstack-iscsid) Feb 23 03:59:10 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 03:59:10 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 03:59:10 localhost podman[107519]: 2026-02-23 08:59:10.206472925 +0000 UTC m=+0.356690757 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:59:10 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:59:10 localhost podman[107508]: 2026-02-23 08:59:10.255066234 +0000 UTC m=+0.407101502 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z) Feb 23 03:59:10 localhost podman[107502]: 2026-02-23 08:59:10.263200538 +0000 UTC m=+0.423801213 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 03:59:10 localhost podman[107502]: unhealthy Feb 23 03:59:10 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:59:10 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 03:59:10 localhost podman[107501]: 2026-02-23 08:59:10.3154287 +0000 UTC m=+0.481997832 container exec_died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com) Feb 23 03:59:10 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Deactivated successfully. 
Feb 23 03:59:10 localhost podman[107508]: 2026-02-23 08:59:10.365927728 +0000 UTC m=+0.517963036 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 03:59:10 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:59:11 localhost podman[107683]: 2026-02-23 08:59:11.933258412 +0000 UTC m=+0.109095170 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, version=17.1.13) Feb 23 03:59:12 localhost podman[107683]: 2026-02-23 08:59:12.385797632 +0000 UTC m=+0.561634280 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, 
managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com) Feb 23 03:59:12 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:59:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20695 DF PROTO=TCP SPT=46774 DPT=9882 SEQ=522294137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8BAC60000000001030307) Feb 23 03:59:12 localhost sshd[107707]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:59:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3004 DF PROTO=TCP SPT=50600 DPT=9105 SEQ=646796191 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8C6060000000001030307) Feb 23 03:59:16 localhost sshd[107709]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:59:17 localhost systemd-logind[759]: New session 37 of user zuul. Feb 23 03:59:17 localhost systemd[1]: Started Session 37 of User zuul. Feb 23 03:59:18 localhost python3.9[107804]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 03:59:18 localhost systemd[1]: Reloading. Feb 23 03:59:18 localhost systemd-rc-local-generator[107827]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 03:59:18 localhost systemd-sysv-generator[107833]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:59:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:59:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2467 DF PROTO=TCP SPT=58868 DPT=9101 SEQ=2655477868 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8D2060000000001030307) Feb 23 03:59:19 localhost python3.9[107929]: ansible-ansible.builtin.service_facts Invoked Feb 23 03:59:19 localhost network[107946]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 03:59:19 localhost network[107947]: 'network-scripts' will be removed from distribution in near future. Feb 23 03:59:19 localhost network[107948]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 03:59:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:59:23 localhost python3.9[108221]: ansible-ansible.builtin.service_facts Invoked Feb 23 03:59:23 localhost network[108238]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 03:59:23 localhost network[108239]: 'network-scripts' will be removed from distribution in near future. Feb 23 03:59:23 localhost network[108240]: It is advised to switch to 'NetworkManager' instead for network management. 
Feb 23 03:59:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57791 DF PROTO=TCP SPT=39384 DPT=9102 SEQ=2834932690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8E8400000000001030307) Feb 23 03:59:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20697 DF PROTO=TCP SPT=46774 DPT=9882 SEQ=522294137 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8EA070000000001030307) Feb 23 03:59:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:59:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57793 DF PROTO=TCP SPT=39384 DPT=9102 SEQ=2834932690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD8F4460000000001030307) Feb 23 03:59:29 localhost python3.9[108439]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 03:59:29 localhost systemd[1]: Reloading. Feb 23 03:59:29 localhost systemd-sysv-generator[108468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 03:59:29 localhost systemd-rc-local-generator[108465]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 03:59:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 03:59:30 localhost systemd[1]: Stopping ceilometer_agent_compute container... Feb 23 03:59:30 localhost systemd[1]: tmp-crun.9smelV.mount: Deactivated successfully. Feb 23 03:59:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54041 DF PROTO=TCP SPT=51734 DPT=9100 SEQ=1973825206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD902C60000000001030307) Feb 23 03:59:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 03:59:32 localhost systemd[1]: tmp-crun.XhJ3kl.mount: Deactivated successfully. Feb 23 03:59:32 localhost podman[108494]: 2026-02-23 08:59:32.17192037 +0000 UTC m=+0.093625837 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container) Feb 23 03:59:32 localhost podman[108494]: 2026-02-23 08:59:32.213647634 +0000 UTC m=+0.135353061 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 03:59:32 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. Feb 23 03:59:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54042 DF PROTO=TCP SPT=51734 DPT=9100 SEQ=1973825206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD90AC60000000001030307) Feb 23 03:59:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62890 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=566802748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD917860000000001030307) Feb 23 03:59:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16383 DF PROTO=TCP SPT=39748 DPT=9882 SEQ=4206382286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD923F30000000001030307) Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 03:59:40 localhost podman[108545]: Error: container 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 is not running Feb 23 03:59:40 localhost podman[108516]: 2026-02-23 08:59:40.423820355 +0000 UTC m=+0.093608905 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 03:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 03:59:40 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Main process exited, code=exited, status=125/n/a Feb 23 03:59:40 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed with result 'exit-code'. Feb 23 03:59:40 localhost systemd[1]: tmp-crun.avrdYm.mount: Deactivated successfully. 
Feb 23 03:59:40 localhost podman[108521]: 2026-02-23 08:59:40.475302085 +0000 UTC m=+0.135633810 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 03:59:40 localhost podman[108516]: 2026-02-23 08:59:40.490588312 +0000 UTC m=+0.160376862 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Feb 23 03:59:40 localhost podman[108516]: unhealthy Feb 23 03:59:40 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:59:40 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 03:59:40 localhost podman[108530]: 2026-02-23 08:59:40.544967722 +0000 UTC m=+0.197352199 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Feb 23 03:59:40 localhost podman[108525]: 2026-02-23 08:59:40.502701191 +0000 UTC m=+0.154293853 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 03:59:40 localhost podman[108525]: 2026-02-23 08:59:40.585197918 +0000 UTC m=+0.236790540 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.openshift.expose-services=, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Feb 23 03:59:40 localhost podman[108525]: unhealthy Feb 23 03:59:40 localhost podman[108518]: 2026-02-23 08:59:40.595955794 +0000 UTC m=+0.256261768 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack 
TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 03:59:40 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:59:40 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 03:59:40 localhost podman[108521]: 2026-02-23 08:59:40.605853073 +0000 UTC m=+0.266184818 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, container_name=logrotate_crond, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 03:59:40 localhost podman[108518]: 2026-02-23 08:59:40.613225494 +0000 UTC m=+0.273531398 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com) Feb 23 03:59:40 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 03:59:40 localhost podman[108518]: unhealthy Feb 23 03:59:40 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 03:59:40 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 03:59:40 localhost podman[108593]: 2026-02-23 08:59:40.563342225 +0000 UTC m=+0.123743937 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.buildah.version=1.41.5) Feb 23 03:59:40 localhost podman[108517]: 2026-02-23 08:59:40.673405294 +0000 UTC m=+0.338758465 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Feb 23 03:59:40 localhost podman[108517]: 2026-02-23 08:59:40.68736869 +0000 UTC m=+0.352721951 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 
03:59:40 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 03:59:40 localhost podman[108593]: 2026-02-23 08:59:40.742902046 +0000 UTC m=+0.303303838 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:59:40 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 03:59:40 localhost podman[108530]: 2026-02-23 08:59:40.780174901 +0000 UTC m=+0.432559388 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:14Z, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_id=tripleo_step1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:59:40 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 03:59:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16385 DF PROTO=TCP SPT=39748 DPT=9882 SEQ=4206382286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD930060000000001030307) Feb 23 03:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 03:59:42 localhost podman[108682]: 2026-02-23 08:59:42.665071347 +0000 UTC m=+0.089810717 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5) Feb 23 03:59:43 localhost podman[108682]: 2026-02-23 08:59:43.078533036 +0000 UTC m=+0.503272366 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 03:59:43 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 03:59:43 localhost sshd[108705]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:59:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18578 DF PROTO=TCP SPT=44066 DPT=9105 SEQ=27882988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD93A060000000001030307) Feb 23 03:59:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62892 DF PROTO=TCP SPT=41750 DPT=9101 SEQ=566802748 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD948070000000001030307) Feb 23 03:59:53 localhost sshd[108707]: main: sshd: ssh-rsa algorithm is disabled Feb 23 03:59:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 03:59:53 localhost recover_tripleo_nova_virtqemud[108710]: 61982 Feb 23 03:59:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 03:59:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 03:59:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31920 DF PROTO=TCP SPT=38510 DPT=9102 SEQ=3709768302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD95D700000000001030307) Feb 23 03:59:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16387 DF PROTO=TCP SPT=39748 DPT=9882 SEQ=4206382286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD960070000000001030307) Feb 23 03:59:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31922 DF PROTO=TCP SPT=38510 DPT=9102 SEQ=3709768302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD969870000000001030307) Feb 23 04:00:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14393 DF PROTO=TCP SPT=53710 DPT=9100 SEQ=26865489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD978060000000001030307) Feb 23 04:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 04:00:02 localhost podman[108715]: 2026-02-23 09:00:02.667557266 +0000 UTC m=+0.090628224 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd) Feb 23 04:00:02 localhost podman[108715]: 2026-02-23 09:00:02.707334648 +0000 UTC m=+0.130405556 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Feb 23 04:00:02 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 04:00:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14394 DF PROTO=TCP SPT=53710 DPT=9100 SEQ=26865489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD980060000000001030307) Feb 23 04:00:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29966 DF PROTO=TCP SPT=54134 DPT=9101 SEQ=47187795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD98CC70000000001030307) Feb 23 04:00:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5816 DF PROTO=TCP SPT=50952 DPT=9882 SEQ=2048617207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD999230000000001030307) Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. 
Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 04:00:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 04:00:10 localhost podman[108737]: 2026-02-23 09:00:10.937949098 +0000 UTC m=+0.105589901 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 04:00:10 localhost podman[108738]: 2026-02-23 09:00:10.948105285 +0000 UTC m=+0.111448793 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.buildah.version=1.41.5, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public) Feb 23 04:00:11 localhost podman[108739]: Error: container 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 is not running Feb 23 04:00:11 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Main process exited, code=exited, status=125/n/a Feb 23 04:00:11 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed with result 'exit-code'. 
Feb 23 04:00:11 localhost podman[108758]: 2026-02-23 09:00:10.995273289 +0000 UTC m=+0.136923930 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:00:11 localhost podman[108763]: 2026-02-23 09:00:11.064535593 +0000 UTC m=+0.205756140 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64) Feb 23 04:00:11 localhost podman[108740]: 2026-02-23 09:00:11.000155141 +0000 UTC m=+0.155355634 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=) Feb 23 04:00:11 localhost podman[108752]: 2026-02-23 09:00:11.121539144 +0000 UTC m=+0.267625623 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:00:11 localhost podman[108752]: 2026-02-23 09:00:11.131354991 +0000 UTC m=+0.277441490 container exec_died 
b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron) Feb 23 04:00:11 localhost podman[108738]: 2026-02-23 09:00:11.138492623 +0000 UTC m=+0.301836112 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=iscsid, vcs-type=git, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 04:00:11 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 04:00:11 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. 
Feb 23 04:00:11 localhost podman[108758]: 2026-02-23 09:00:11.182022864 +0000 UTC m=+0.323673515 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510) Feb 23 04:00:11 localhost podman[108758]: unhealthy Feb 23 04:00:11 localhost podman[108740]: 2026-02-23 09:00:11.191111618 +0000 UTC m=+0.346312171 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public) Feb 23 04:00:11 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:00:11 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 04:00:11 localhost podman[108740]: unhealthy Feb 23 04:00:11 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:00:11 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 04:00:11 localhost podman[108737]: 2026-02-23 09:00:11.235795084 +0000 UTC m=+0.403435887 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, container_name=ovn_controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible) Feb 23 04:00:11 localhost podman[108737]: unhealthy Feb 23 04:00:11 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:00:11 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:00:11 localhost podman[108763]: 2026-02-23 09:00:11.273179723 +0000 UTC m=+0.414400290 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64) Feb 23 04:00:11 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 04:00:11 localhost podman[108746]: 2026-02-23 09:00:11.025802593 +0000 UTC m=+0.166459132 container health_status 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 04:00:11 localhost podman[108746]: 2026-02-23 09:00:11.32016466 +0000 UTC m=+0.460821199 container exec_died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, release=1766032510, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:00:11 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Deactivated successfully. Feb 23 04:00:11 localhost systemd[1]: tmp-crun.N5WIVy.mount: Deactivated successfully. 
Feb 23 04:00:12 localhost podman[108480]: time="2026-02-23T09:00:12Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Feb 23 04:00:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52448 SEQ=3894642472 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 23 04:00:12 localhost systemd[1]: libpod-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope: Deactivated successfully. Feb 23 04:00:12 localhost systemd[1]: libpod-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope: Consumed 6.775s CPU time. Feb 23 04:00:12 localhost podman[108480]: 2026-02-23 09:00:12.135788725 +0000 UTC m=+42.109811059 container died 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 04:00:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: Deactivated successfully. Feb 23 04:00:12 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9. 
Feb 23 04:00:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: No such file or directory Feb 23 04:00:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9-userdata-shm.mount: Deactivated successfully. Feb 23 04:00:12 localhost podman[108480]: 2026-02-23 09:00:12.194740717 +0000 UTC m=+42.168763051 container cleanup 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Feb 23 04:00:12 localhost podman[108480]: ceilometer_agent_compute Feb 23 04:00:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: No such file or directory Feb 23 04:00:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: No such file or directory Feb 23 04:00:12 localhost podman[108897]: 2026-02-23 09:00:12.222131754 +0000 UTC m=+0.078957908 container cleanup 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:00:12 localhost systemd[1]: libpod-conmon-68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.scope: Deactivated successfully. Feb 23 04:00:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.timer: No such file or directory Feb 23 04:00:12 localhost systemd[1]: 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: Failed to open /run/systemd/transient/68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9.service: No such file or directory Feb 23 04:00:12 localhost podman[108909]: 2026-02-23 09:00:12.330109388 +0000 UTC m=+0.073720715 container cleanup 68bca18bd55a4f6164da96c9ab36a65957e1d6d1345b4b77b36731a975c3a9b9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:00:12 localhost podman[108909]: ceilometer_agent_compute Feb 23 04:00:12 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Feb 23 04:00:12 localhost systemd[1]: Stopped ceilometer_agent_compute container. Feb 23 04:00:12 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.112s CPU time, no IO. Feb 23 04:00:12 localhost systemd[1]: var-lib-containers-storage-overlay-85b58d6db47c08b0dc415e7676af05b270f66bddea6d8ca4f2d3998d7b04080d-merged.mount: Deactivated successfully. Feb 23 04:00:13 localhost python3.9[109013]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 04:00:13 localhost systemd[1]: Reloading. 
Feb 23 04:00:13 localhost podman[109015]: 2026-02-23 09:00:13.26594923 +0000 UTC m=+0.070052170 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com) Feb 23 04:00:13 localhost systemd-rc-local-generator[109064]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:00:13 localhost systemd-sysv-generator[109067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:00:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:00:13 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... 
Feb 23 04:00:13 localhost podman[109015]: 2026-02-23 09:00:13.696586076 +0000 UTC m=+0.500688986 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true) Feb 23 04:00:13 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 04:00:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14396 DF PROTO=TCP SPT=53710 DPT=9100 SEQ=26865489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9B0060000000001030307) Feb 23 04:00:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29968 DF PROTO=TCP SPT=54134 DPT=9101 SEQ=47187795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9BC070000000001030307) Feb 23 04:00:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45875 DF PROTO=TCP SPT=33512 DPT=9102 SEQ=461622767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9D2A00000000001030307) Feb 23 04:00:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5820 DF PROTO=TCP SPT=50952 DPT=9882 SEQ=2048617207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9D6070000000001030307) Feb 23 04:00:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52448 SEQ=3894642472 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 23 04:00:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25464 DF PROTO=TCP SPT=55642 DPT=9100 SEQ=2139831579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9ED070000000001030307) Feb 23 04:00:32 localhost sshd[109168]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:00:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 04:00:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25465 DF PROTO=TCP SPT=55642 DPT=9100 SEQ=2139831579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BD9F5060000000001030307) Feb 23 04:00:32 localhost systemd[1]: tmp-crun.fq3DkZ.mount: Deactivated successfully. 
Feb 23 04:00:32 localhost podman[109170]: 2026-02-23 09:00:32.921328364 +0000 UTC m=+0.102323605 container health_status 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z) Feb 23 04:00:32 localhost podman[109170]: 2026-02-23 09:00:32.934532975 +0000 UTC m=+0.115528236 container exec_died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3) Feb 23 04:00:32 localhost sshd[109187]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:00:32 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Deactivated successfully. 
Feb 23 04:00:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39575 DF PROTO=TCP SPT=53718 DPT=9101 SEQ=1764346632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA02070000000001030307) Feb 23 04:00:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45879 DF PROTO=TCP SPT=33512 DPT=9102 SEQ=461622767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA0E060000000001030307) Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 04:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. 
Feb 23 04:00:41 localhost podman[109194]: Error: container 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 is not running Feb 23 04:00:41 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Main process exited, code=exited, status=125/n/a Feb 23 04:00:41 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed with result 'exit-code'. Feb 23 04:00:41 localhost systemd[1]: tmp-crun.UHE6cs.mount: Deactivated successfully. Feb 23 04:00:41 localhost podman[109192]: 2026-02-23 09:00:41.697086221 +0000 UTC m=+0.105676839 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team) Feb 23 04:00:41 localhost podman[109192]: 2026-02-23 09:00:41.736325452 +0000 UTC m=+0.144916100 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team) Feb 23 04:00:41 localhost systemd[1]: tmp-crun.PUvHOt.mount: Deactivated successfully. 
Feb 23 04:00:41 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 04:00:41 localhost podman[109191]: 2026-02-23 09:00:41.785581354 +0000 UTC m=+0.195568335 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64) Feb 23 04:00:41 localhost podman[109195]: 2026-02-23 09:00:41.741808772 +0000 UTC m=+0.137340594 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com) Feb 23 04:00:41 localhost podman[109193]: 2026-02-23 09:00:41.806376232 +0000 UTC m=+0.201428529 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, version=17.1.13, url=https://www.redhat.com, release=1766032510, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc.) Feb 23 04:00:41 localhost podman[109193]: 2026-02-23 09:00:41.847969596 +0000 UTC m=+0.243021893 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true) Feb 23 04:00:41 localhost podman[109193]: unhealthy Feb 23 04:00:41 localhost podman[109212]: 2026-02-23 09:00:41.859096512 +0000 UTC m=+0.250930249 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, container_name=metrics_qdr, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
url=https://www.redhat.com, distribution-scope=public) Feb 23 04:00:41 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:00:41 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:00:41 localhost podman[109191]: 2026-02-23 09:00:41.86835578 +0000 UTC m=+0.278342751 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64) Feb 23 04:00:41 localhost podman[109191]: unhealthy Feb 23 04:00:41 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:00:41 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:00:41 localhost podman[109207]: 2026-02-23 09:00:41.912680979 +0000 UTC m=+0.298433807 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 04:00:41 localhost podman[109195]: 2026-02-23 09:00:41.930120692 +0000 UTC m=+0.325652544 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, url=https://www.redhat.com, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron) Feb 23 04:00:41 localhost podman[109207]: 2026-02-23 09:00:41.93745078 +0000 UTC m=+0.323203648 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-type=git, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 04:00:41 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 04:00:41 localhost podman[109207]: unhealthy Feb 23 04:00:41 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:00:41 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. Feb 23 04:00:42 localhost podman[109212]: 2026-02-23 09:00:42.118494713 +0000 UTC m=+0.510328430 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 
'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Feb 23 04:00:42 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 04:00:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36339 DF PROTO=TCP SPT=49580 DPT=9882 SEQ=53511426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA1A460000000001030307) Feb 23 04:00:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 04:00:43 localhost systemd[1]: tmp-crun.VlgfwV.mount: Deactivated successfully. Feb 23 04:00:43 localhost podman[109326]: 2026-02-23 09:00:43.916860859 +0000 UTC m=+0.093885162 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com) Feb 23 04:00:44 localhost podman[109326]: 2026-02-23 09:00:44.306598735 +0000 UTC m=+0.483622998 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 04:00:44 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. 
Feb 23 04:00:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59359 DF PROTO=TCP SPT=35460 DPT=9105 SEQ=2680018870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA24060000000001030307) Feb 23 04:00:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39577 DF PROTO=TCP SPT=53718 DPT=9101 SEQ=1764346632 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA32060000000001030307) Feb 23 04:00:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11700 DF PROTO=TCP SPT=32960 DPT=9102 SEQ=195551821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA47D00000000001030307) Feb 23 04:00:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36341 DF PROTO=TCP SPT=49580 DPT=9882 SEQ=53511426 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA4A070000000001030307) Feb 23 04:00:55 localhost podman[109076]: time="2026-02-23T09:00:55Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Feb 23 04:00:55 localhost systemd[1]: tmp-crun.bEolY8.mount: Deactivated successfully. Feb 23 04:00:55 localhost systemd[1]: libpod-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope: Deactivated successfully. Feb 23 04:00:55 localhost systemd[1]: libpod-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope: Consumed 7.261s CPU time. 
Feb 23 04:00:55 localhost podman[109076]: 2026-02-23 09:00:55.789167514 +0000 UTC m=+42.111801170 container died 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 04:00:55 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: Deactivated successfully. Feb 23 04:00:55 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950. 
Feb 23 04:00:55 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: No such file or directory Feb 23 04:00:55 localhost podman[109076]: 2026-02-23 09:00:55.846430526 +0000 UTC m=+42.169064182 container cleanup 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Feb 23 04:00:55 localhost podman[109076]: ceilometer_agent_ipmi Feb 23 04:00:55 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: No such file or directory Feb 23 04:00:55 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: No such file or directory Feb 23 04:00:55 localhost podman[109350]: 2026-02-23 09:00:55.890775575 +0000 UTC m=+0.082811687 container cleanup 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06) Feb 23 04:00:55 localhost systemd[1]: libpod-conmon-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.scope: Deactivated successfully. Feb 23 04:00:55 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.timer: No such file or directory Feb 23 04:00:56 localhost systemd[1]: 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: Failed to open /run/systemd/transient/9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950.service: No such file or directory Feb 23 04:00:56 localhost podman[109364]: 2026-02-23 09:00:56.001056917 +0000 UTC m=+0.076627186 container cleanup 9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '44281c742f88411d75916a4e58499720'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:00:56 localhost podman[109364]: ceilometer_agent_ipmi Feb 23 04:00:56 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Feb 23 04:00:56 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Feb 23 04:00:56 localhost systemd[1]: var-lib-containers-storage-overlay-51915910ced93426f00f1704499e6c4900ce6f68bf275b1a1584b9abaa73dcbc-merged.mount: Deactivated successfully. 
Feb 23 04:00:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ee3aac6247d0954a57a271eeef5c9992c0afd6e065635f640c1c3948667f950-userdata-shm.mount: Deactivated successfully. Feb 23 04:00:56 localhost python3.9[109466]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:00:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11702 DF PROTO=TCP SPT=32960 DPT=9102 SEQ=195551821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA53C70000000001030307) Feb 23 04:00:57 localhost systemd[1]: Reloading. Feb 23 04:00:58 localhost systemd-rc-local-generator[109489]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:00:58 localhost systemd-sysv-generator[109492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:00:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:00:58 localhost systemd[1]: Stopping collectd container... 
Feb 23 04:01:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58829 DF PROTO=TCP SPT=59464 DPT=9100 SEQ=1379809827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA62460000000001030307) Feb 23 04:01:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58830 DF PROTO=TCP SPT=59464 DPT=9100 SEQ=1379809827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA6A460000000001030307) Feb 23 04:01:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 04:01:03 localhost podman[109546]: Error: container 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 is not running Feb 23 04:01:03 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Main process exited, code=exited, status=125/n/a Feb 23 04:01:03 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed with result 'exit-code'. Feb 23 04:01:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30645 DF PROTO=TCP SPT=36270 DPT=9101 SEQ=807460934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA77460000000001030307) Feb 23 04:01:07 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 04:01:07 localhost recover_tripleo_nova_virtqemud[109559]: 61982 Feb 23 04:01:07 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 04:01:07 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 04:01:08 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52466 SEQ=3556684315 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 23 04:01:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 04:01:11 localhost systemd[1]: tmp-crun.d6dPC1.mount: Deactivated successfully. Feb 23 04:01:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:01:11 localhost podman[109560]: 2026-02-23 09:01:11.937992683 +0000 UTC m=+0.108506968 container health_status 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 04:01:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:01:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 04:01:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 04:01:11 localhost podman[109560]: 2026-02-23 09:01:11.983403015 +0000 UTC m=+0.153917240 container exec_died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, 
com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13) Feb 23 04:01:12 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Deactivated successfully. Feb 23 04:01:12 localhost podman[109580]: 2026-02-23 09:01:12.039380317 +0000 UTC m=+0.085694538 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, version=17.1.13, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Feb 23 04:01:12 localhost podman[109580]: 2026-02-23 
09:01:12.087595827 +0000 UTC m=+0.133910038 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, container_name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn) Feb 23 04:01:12 localhost podman[109580]: unhealthy Feb 23 04:01:12 localhost podman[109579]: 2026-02-23 09:01:12.096406281 +0000 UTC m=+0.146339694 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13) Feb 23 04:01:12 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:01:12 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 04:01:12 localhost podman[109579]: 2026-02-23 09:01:12.114445293 +0000 UTC m=+0.164378656 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13) Feb 23 04:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 04:01:12 localhost podman[109587]: 2026-02-23 09:01:12.167038559 +0000 UTC m=+0.205073082 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, release=1766032510) Feb 23 04:01:12 localhost podman[109579]: unhealthy Feb 23 04:01:12 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:01:12 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 04:01:12 localhost podman[109587]: 2026-02-23 09:01:12.254452369 +0000 UTC m=+0.292486852 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat 
OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z) Feb 23 04:01:12 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 04:01:12 localhost podman[109593]: 2026-02-23 09:01:12.304249008 +0000 UTC m=+0.335580912 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1) Feb 23 04:01:12 localhost podman[109593]: 2026-02-23 09:01:12.327308326 +0000 UTC m=+0.358640240 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 04:01:12 localhost sshd[109686]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:01:12 localhost podman[109593]: unhealthy Feb 23 04:01:12 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:01:12 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. 
Feb 23 04:01:12 localhost podman[109647]: 2026-02-23 09:01:12.255743219 +0000 UTC m=+0.078028509 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, version=17.1.13, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step1) Feb 23 04:01:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33856 DF PROTO=TCP SPT=34684 DPT=9882 SEQ=912446976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA8F860000000001030307) Feb 23 04:01:12 localhost podman[109647]: 2026-02-23 09:01:12.52347397 +0000 UTC m=+0.345759270 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, container_name=metrics_qdr) Feb 23 04:01:12 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. 
Feb 23 04:01:12 localhost sshd[109688]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:01:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 04:01:14 localhost podman[109690]: 2026-02-23 09:01:14.920061509 +0000 UTC m=+0.087289707 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com) Feb 23 04:01:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53492 DF PROTO=TCP SPT=44140 DPT=9105 SEQ=3525360412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDA9A060000000001030307) Feb 23 04:01:15 localhost podman[109690]: 2026-02-23 09:01:15.341343938 +0000 UTC m=+0.508572156 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 04:01:15 localhost 
systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 04:01:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30647 DF PROTO=TCP SPT=36270 DPT=9101 SEQ=807460934 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAA8060000000001030307) Feb 23 04:01:22 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52466 SEQ=3556684315 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 23 04:01:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2242 DF PROTO=TCP SPT=49472 DPT=9102 SEQ=2622482458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDABD000000000001030307) Feb 23 04:01:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2244 DF PROTO=TCP SPT=49472 DPT=9102 SEQ=2622482458 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAC9060000000001030307) Feb 23 04:01:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56182 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=2807125067 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAD7870000000001030307) Feb 23 04:01:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56183 DF PROTO=TCP SPT=41342 DPT=9100 SEQ=2807125067 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDADF860000000001030307) Feb 23 04:01:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. Feb 23 04:01:33 localhost podman[109840]: Error: container 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 is not running Feb 23 04:01:33 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Main process exited, code=exited, status=125/n/a Feb 23 04:01:33 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed with result 'exit-code'. Feb 23 04:01:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3111 DF PROTO=TCP SPT=52292 DPT=9101 SEQ=322994156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAEC460000000001030307) Feb 23 04:01:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1999 DF PROTO=TCP SPT=33644 DPT=9882 SEQ=191684158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDAF8B30000000001030307) Feb 23 04:01:40 localhost podman[109506]: time="2026-02-23T09:01:40Z" level=warning msg="StopSignal SIGTERM failed to stop container collectd in 42 seconds, resorting to SIGKILL" Feb 23 04:01:40 localhost podman[109506]: 2026-02-23 09:01:40.342344401 +0000 UTC m=+42.068250800 container stop 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, container_name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, 
vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd) Feb 23 04:01:40 localhost systemd[1]: libpod-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope: Deactivated successfully. Feb 23 04:01:40 localhost systemd[1]: libpod-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope: Consumed 2.685s CPU time. Feb 23 04:01:40 localhost podman[109506]: 2026-02-23 09:01:40.374199011 +0000 UTC m=+42.100105380 container died 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:10:15Z, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Feb 23 04:01:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: Deactivated successfully. Feb 23 04:01:40 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759. 
Feb 23 04:01:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: No such file or directory Feb 23 04:01:40 localhost podman[109506]: 2026-02-23 09:01:40.480799629 +0000 UTC m=+42.206705978 container cleanup 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, distribution-scope=public, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 04:01:40 localhost podman[109506]: collectd Feb 23 04:01:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: No such file or directory Feb 23 04:01:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: No such file or directory Feb 23 04:01:40 localhost podman[109853]: 2026-02-23 09:01:40.513359602 +0000 UTC m=+0.153980613 container cleanup 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.) Feb 23 04:01:40 localhost systemd[1]: libpod-conmon-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.scope: Deactivated successfully. Feb 23 04:01:40 localhost podman[109884]: error opening file `/run/crun/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759/status`: No such file or directory Feb 23 04:01:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.timer: No such file or directory Feb 23 04:01:40 localhost systemd[1]: 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: Failed to open /run/systemd/transient/186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759.service: No such file or directory Feb 23 04:01:40 localhost podman[109871]: 2026-02-23 09:01:40.630692342 +0000 UTC m=+0.080443504 container cleanup 186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, io.buildah.version=1.41.5, 
vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true) Feb 23 04:01:40 localhost podman[109871]: collectd Feb 23 04:01:40 localhost systemd[1]: tripleo_collectd.service: Deactivated successfully. Feb 23 04:01:40 localhost systemd[1]: Stopped collectd container. Feb 23 04:01:41 localhost systemd[1]: var-lib-containers-storage-overlay-70b7b3f393818c1da1a59fd13309ca9cf26b2dd139b3696bb046bf52c3291b46-merged.mount: Deactivated successfully. Feb 23 04:01:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-186ca60db4c7ee39946f50168badf01085abd90437978f2016d09882612c6759-userdata-shm.mount: Deactivated successfully. Feb 23 04:01:41 localhost python3.9[109977]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:41 localhost systemd[1]: Reloading. Feb 23 04:01:41 localhost systemd-sysv-generator[110009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:01:41 localhost systemd-rc-local-generator[110006]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:01:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:01:41 localhost systemd[1]: Stopping iscsid container... Feb 23 04:01:41 localhost systemd[1]: libpod-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope: Deactivated successfully. 
Feb 23 04:01:41 localhost systemd[1]: libpod-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope: Consumed 1.258s CPU time. Feb 23 04:01:41 localhost podman[110017]: 2026-02-23 09:01:41.945646637 +0000 UTC m=+0.083385475 container died 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Feb 23 04:01:41 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: Deactivated successfully. Feb 23 04:01:41 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f. Feb 23 04:01:41 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: No such file or directory Feb 23 04:01:41 localhost systemd[1]: tmp-crun.AIUzzI.mount: Deactivated successfully. Feb 23 04:01:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:01:41 localhost podman[110017]: 2026-02-23 09:01:41.997511821 +0000 UTC m=+0.135250639 container cleanup 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Feb 23 04:01:41 localhost podman[110017]: iscsid Feb 23 04:01:42 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: No such file or directory Feb 23 04:01:42 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: No such file or directory Feb 23 04:01:42 localhost podman[110031]: 2026-02-23 09:01:42.04248917 +0000 UTC m=+0.081099804 container cleanup 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public) Feb 23 04:01:42 localhost systemd[1]: libpod-conmon-40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.scope: Deactivated 
successfully. Feb 23 04:01:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:01:42 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.timer: No such file or directory Feb 23 04:01:42 localhost systemd[1]: 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: Failed to open /run/systemd/transient/40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f.service: No such file or directory Feb 23 04:01:42 localhost podman[110046]: 2026-02-23 09:01:42.157158708 +0000 UTC m=+0.081601119 container cleanup 40b94ceafed73b16a2843e994b0c7d1ccf4682b4d1fab8a6230d76c07848da9f (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:01:42 localhost podman[110046]: iscsid Feb 23 04:01:42 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Feb 23 04:01:42 localhost systemd[1]: Stopped iscsid container. 
Feb 23 04:01:42 localhost podman[110058]: 2026-02-23 09:01:42.249843212 +0000 UTC m=+0.091991993 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, distribution-scope=public, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:01:42 localhost podman[110058]: 2026-02-23 09:01:42.265954264 +0000 UTC m=+0.108103045 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com) Feb 23 04:01:42 localhost podman[110058]: unhealthy Feb 23 04:01:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:01:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 04:01:42 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:01:42 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:01:42 localhost systemd[1]: var-lib-containers-storage-overlay-52b3b14c7b87d61fbd3bfa894ff158a1c8322ab7dde44afc684a91162f67f067-merged.mount: Deactivated successfully. Feb 23 04:01:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 04:01:42 localhost podman[110094]: 2026-02-23 09:01:42.463905052 +0000 UTC m=+0.165710017 container health_status b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond) Feb 23 04:01:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2001 DF PROTO=TCP SPT=33644 DPT=9882 SEQ=191684158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB04C60000000001030307) Feb 23 04:01:42 localhost podman[110092]: 2026-02-23 09:01:42.412649358 +0000 UTC m=+0.117856658 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=ovn_controller, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public) Feb 23 04:01:42 localhost podman[110156]: 2026-02-23 09:01:42.532775976 +0000 UTC m=+0.111584524 container health_status c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13) Feb 23 04:01:42 localhost podman[110092]: 2026-02-23 09:01:42.546361558 +0000 UTC m=+0.251568828 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5) Feb 23 04:01:42 localhost podman[110092]: unhealthy Feb 23 04:01:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 04:01:42 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:01:42 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 04:01:42 localhost podman[110156]: 2026-02-23 09:01:42.585397983 +0000 UTC m=+0.164206581 container exec_died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:01:42 localhost podman[110156]: unhealthy Feb 23 04:01:42 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:01:42 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. 
Feb 23 04:01:42 localhost podman[110094]: 2026-02-23 09:01:42.604171267 +0000 UTC m=+0.305976212 container exec_died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, container_name=logrotate_crond, vcs-type=git) Feb 23 04:01:42 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Deactivated successfully. Feb 23 04:01:42 localhost podman[110199]: 2026-02-23 09:01:42.688300345 +0000 UTC m=+0.106703892 container health_status f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, tcib_managed=true, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Feb 23 04:01:42 localhost podman[110199]: 2026-02-23 09:01:42.892545949 +0000 UTC m=+0.310949456 container exec_died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
vcs-type=git, build-date=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, 
name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 04:01:42 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Deactivated successfully. Feb 23 04:01:43 localhost python3.9[110260]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:43 localhost systemd[1]: Reloading. Feb 23 04:01:43 localhost systemd-sysv-generator[110292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:01:43 localhost systemd-rc-local-generator[110284]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:01:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:01:43 localhost systemd[1]: Stopping logrotate_crond container... Feb 23 04:01:43 localhost systemd[1]: libpod-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope: Deactivated successfully. Feb 23 04:01:43 localhost systemd[1]: libpod-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope: Consumed 1.175s CPU time. 
Feb 23 04:01:43 localhost podman[110300]: 2026-02-23 09:01:43.536991161 +0000 UTC m=+0.079844275 container died b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Feb 23 04:01:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: Deactivated successfully. Feb 23 04:01:43 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5. Feb 23 04:01:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: No such file or directory Feb 23 04:01:43 localhost podman[110300]: 2026-02-23 09:01:43.595507562 +0000 UTC m=+0.138360656 container cleanup b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, 
release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, distribution-scope=public) Feb 23 04:01:43 localhost podman[110300]: logrotate_crond Feb 23 04:01:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: No such file or directory Feb 23 04:01:43 localhost 
systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: No such file or directory Feb 23 04:01:43 localhost podman[110313]: 2026-02-23 09:01:43.639678557 +0000 UTC m=+0.085760289 container cleanup b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 04:01:43 localhost systemd[1]: libpod-conmon-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.scope: Deactivated successfully. Feb 23 04:01:43 localhost podman[110342]: error opening file `/run/crun/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5/status`: No such file or directory Feb 23 04:01:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.timer: No such file or directory Feb 23 04:01:43 localhost systemd[1]: b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: Failed to open /run/systemd/transient/b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5.service: No such file or directory Feb 23 04:01:43 localhost podman[110329]: 2026-02-23 09:01:43.760797965 +0000 UTC m=+0.082261310 container cleanup b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, 
io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron) Feb 23 04:01:43 localhost podman[110329]: logrotate_crond Feb 23 04:01:43 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully. Feb 23 04:01:43 localhost systemd[1]: Stopped logrotate_crond container. Feb 23 04:01:44 localhost systemd[1]: var-lib-containers-storage-overlay-b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653-merged.mount: Deactivated successfully. Feb 23 04:01:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0ff784fd3c516a86ea42c4b4923b5ea1f5bc05ac1e108d3d0d988d4d4a569d5-userdata-shm.mount: Deactivated successfully. Feb 23 04:01:44 localhost python3.9[110435]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:44 localhost systemd[1]: Reloading. Feb 23 04:01:44 localhost systemd-rc-local-generator[110464]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:01:44 localhost systemd-sysv-generator[110468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:01:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:01:45 localhost systemd[1]: Stopping metrics_qdr container... 
Feb 23 04:01:45 localhost kernel: qdrouterd[54599]: segfault at 0 ip 00007fd1a6c897cb sp 00007ffd4d99c650 error 4 in libc.so.6[7fd1a6c26000+175000] Feb 23 04:01:45 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Feb 23 04:01:45 localhost systemd[1]: Created slice Slice /system/systemd-coredump. Feb 23 04:01:45 localhost systemd[1]: Started Process Core Dump (PID 110489/UID 0). Feb 23 04:01:45 localhost systemd-coredump[110490]: Resource limits disable core dumping for process 54599 (qdrouterd). Feb 23 04:01:45 localhost systemd-coredump[110490]: Process 54599 (qdrouterd) of user 42465 dumped core. Feb 23 04:01:45 localhost systemd[1]: systemd-coredump@0-110489-0.service: Deactivated successfully. Feb 23 04:01:45 localhost podman[110477]: 2026-02-23 09:01:45.28790388 +0000 UTC m=+0.244261241 container died f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5) Feb 23 04:01:45 localhost systemd[1]: libpod-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope: Deactivated successfully. Feb 23 04:01:45 localhost systemd[1]: libpod-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope: Consumed 31.088s CPU time. Feb 23 04:01:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: Deactivated successfully. 
Feb 23 04:01:45 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f. Feb 23 04:01:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: No such file or directory Feb 23 04:01:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f-userdata-shm.mount: Deactivated successfully. Feb 23 04:01:45 localhost podman[110477]: 2026-02-23 09:01:45.345200534 +0000 UTC m=+0.301557865 container cleanup f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:01:45 localhost podman[110477]: metrics_qdr Feb 23 04:01:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39641 DF PROTO=TCP SPT=49854 DPT=9105 SEQ=996332164 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB10060000000001030307) Feb 23 04:01:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 04:01:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: No such file or directory Feb 23 04:01:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: No such file or directory Feb 23 04:01:45 localhost podman[110494]: 2026-02-23 09:01:45.391343989 +0000 UTC m=+0.091723574 container cleanup f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Feb 23 04:01:45 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a Feb 23 04:01:45 localhost systemd[1]: var-lib-containers-storage-overlay-72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d-merged.mount: Deactivated successfully. Feb 23 04:01:45 localhost systemd[1]: libpod-conmon-f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.scope: Deactivated successfully. 
Feb 23 04:01:45 localhost podman[110511]: 2026-02-23 09:01:45.481677391 +0000 UTC m=+0.094063119 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team) Feb 23 04:01:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.timer: No such file or directory Feb 23 04:01:45 localhost systemd[1]: f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: Failed to open /run/systemd/transient/f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f.service: No such file or directory Feb 23 04:01:45 localhost podman[110520]: 2026-02-23 09:01:45.501817317 +0000 UTC m=+0.069066300 container cleanup f84abeb91c350d20122a38351edd1e7d21b06536eac74f6a4979bae73ef8cd3f (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '90a8871bd317528138d212bd0375f6aa'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git) Feb 23 04:01:45 localhost podman[110520]: metrics_qdr Feb 23 04:01:45 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with 
result 'exit-code'. Feb 23 04:01:45 localhost systemd[1]: Stopped metrics_qdr container. Feb 23 04:01:45 localhost podman[110511]: 2026-02-23 09:01:45.88532933 +0000 UTC m=+0.497714998 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 04:01:45 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. Feb 23 04:01:46 localhost python3.9[110636]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:46 localhost python3.9[110729]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f0:86:db MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52466 SEQ=3556684315 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Feb 23 04:01:48 localhost python3.9[110822]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:49 localhost 
python3.9[110915]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:01:49 localhost systemd[1]: Reloading. Feb 23 04:01:49 localhost systemd-sysv-generator[110945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:01:49 localhost systemd-rc-local-generator[110938]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:01:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:01:49 localhost systemd[1]: Stopping nova_compute container... Feb 23 04:01:50 localhost sshd[110967]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:01:53 localhost sshd[110969]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:01:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54195 DF PROTO=TCP SPT=42396 DPT=9102 SEQ=975616764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB32300000000001030307) Feb 23 04:01:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2003 DF PROTO=TCP SPT=33644 DPT=9882 SEQ=191684158 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB34060000000001030307) Feb 23 04:01:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54197 DF 
PROTO=TCP SPT=42396 DPT=9102 SEQ=975616764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB3E470000000001030307) Feb 23 04:02:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5157 DF PROTO=TCP SPT=42228 DPT=9100 SEQ=1852176895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB4CC60000000001030307) Feb 23 04:02:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5158 DF PROTO=TCP SPT=42228 DPT=9100 SEQ=1852176895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB54C60000000001030307) Feb 23 04:02:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59675 DF PROTO=TCP SPT=47758 DPT=9101 SEQ=3553128173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB61870000000001030307) Feb 23 04:02:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9885 DF PROTO=TCP SPT=46468 DPT=9882 SEQ=2010042467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB6DE30000000001030307) Feb 23 04:02:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9887 DF PROTO=TCP SPT=46468 DPT=9882 SEQ=2010042467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB7A060000000001030307) Feb 23 04:02:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 04:02:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:02:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. Feb 23 04:02:12 localhost podman[110972]: 2026-02-23 09:02:12.710279664 +0000 UTC m=+0.135426454 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, version=17.1.13, architecture=x86_64) Feb 23 04:02:12 localhost podman[110972]: 2026-02-23 09:02:12.726566481 +0000 UTC m=+0.151713321 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, vendor=Red Hat, Inc.) Feb 23 04:02:12 localhost podman[110972]: unhealthy Feb 23 04:02:12 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:02:12 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:02:12 localhost podman[110990]: 2026-02-23 09:02:12.817892693 +0000 UTC m=+0.099384534 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, managed_by=tripleo_ansible, container_name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true) Feb 23 04:02:12 localhost podman[110991]: Error: container c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 is not running Feb 23 04:02:12 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Main process exited, code=exited, status=125/n/a Feb 23 04:02:12 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed with result 'exit-code'. 
Feb 23 04:02:12 localhost podman[110990]: 2026-02-23 09:02:12.861345345 +0000 UTC m=+0.142837196 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64) Feb 23 04:02:12 localhost podman[110990]: unhealthy Feb 23 04:02:12 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:02:12 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:02:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5160 DF PROTO=TCP SPT=42228 DPT=9100 SEQ=1852176895 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB84060000000001030307) Feb 23 04:02:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. 
Feb 23 04:02:16 localhost podman[111026]: 2026-02-23 09:02:16.390701491 +0000 UTC m=+0.071143664 container health_status 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Feb 23 04:02:16 localhost podman[111026]: 2026-02-23 09:02:16.781290065 +0000 UTC m=+0.461732238 container exec_died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 04:02:16 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Deactivated successfully. 
Feb 23 04:02:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59677 DF PROTO=TCP SPT=47758 DPT=9101 SEQ=3553128173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDB92060000000001030307) Feb 23 04:02:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18675 DF PROTO=TCP SPT=36840 DPT=9102 SEQ=2096385036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBA7600000000001030307) Feb 23 04:02:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9889 DF PROTO=TCP SPT=46468 DPT=9882 SEQ=2010042467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBAA070000000001030307) Feb 23 04:02:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18677 DF PROTO=TCP SPT=36840 DPT=9102 SEQ=2096385036 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBB3870000000001030307) Feb 23 04:02:30 localhost sshd[111126]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:02:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39510 DF PROTO=TCP SPT=35126 DPT=9100 SEQ=1814791593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBC1C60000000001030307) Feb 23 04:02:32 localhost podman[110955]: time="2026-02-23T09:02:32Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Feb 23 04:02:32 localhost systemd[1]: 
session-c11.scope: Deactivated successfully. Feb 23 04:02:32 localhost systemd[1]: libpod-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope: Deactivated successfully. Feb 23 04:02:32 localhost systemd[1]: libpod-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope: Consumed 39.939s CPU time. Feb 23 04:02:32 localhost podman[110955]: 2026-02-23 09:02:32.067029038 +0000 UTC m=+42.109890836 container died c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:02:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: Deactivated successfully. Feb 23 04:02:32 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442. 
Feb 23 04:02:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: No such file or directory Feb 23 04:02:32 localhost systemd[1]: var-lib-containers-storage-overlay-e9505ba141111701b3c5c0dc16acea1474e53d4b4405e45ed2eb48993537e49e-merged.mount: Deactivated successfully. Feb 23 04:02:32 localhost podman[110955]: 2026-02-23 09:02:32.137571653 +0000 UTC m=+42.180433371 container cleanup c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Feb 23 04:02:32 localhost podman[110955]: nova_compute Feb 23 04:02:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: Failed to open 
/run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: No such file or directory Feb 23 04:02:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: No such file or directory Feb 23 04:02:32 localhost podman[111129]: 2026-02-23 09:02:32.174534863 +0000 UTC m=+0.089215236 container cleanup c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Feb 23 04:02:32 localhost systemd[1]: libpod-conmon-c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.scope: Deactivated successfully. 
Feb 23 04:02:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.timer: No such file or directory Feb 23 04:02:32 localhost systemd[1]: c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: Failed to open /run/systemd/transient/c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442.service: No such file or directory Feb 23 04:02:32 localhost podman[111144]: 2026-02-23 09:02:32.288430997 +0000 UTC m=+0.080350882 container cleanup c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z) Feb 23 04:02:32 localhost podman[111144]: nova_compute Feb 23 04:02:32 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. 
Feb 23 04:02:32 localhost systemd[1]: Stopped nova_compute container. Feb 23 04:02:32 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.184s CPU time, no IO. Feb 23 04:02:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39511 DF PROTO=TCP SPT=35126 DPT=9100 SEQ=1814791593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBC9E60000000001030307) Feb 23 04:02:33 localhost python3.9[111248]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:02:33 localhost systemd[1]: Reloading. Feb 23 04:02:33 localhost systemd-rc-local-generator[111273]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:02:33 localhost systemd-sysv-generator[111280]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:02:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:02:33 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Feb 23 04:02:33 localhost systemd[1]: Stopping nova_migration_target container... Feb 23 04:02:33 localhost recover_tripleo_nova_virtqemud[111291]: 61982 Feb 23 04:02:33 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 04:02:33 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Feb 23 04:02:33 localhost systemd[1]: libpod-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope: Deactivated successfully. Feb 23 04:02:33 localhost systemd[1]: libpod-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope: Consumed 38.936s CPU time. Feb 23 04:02:33 localhost podman[111290]: 2026-02-23 09:02:33.666493505 +0000 UTC m=+0.081848127 container died 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 04:02:33 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: Deactivated successfully. Feb 23 04:02:33 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0. Feb 23 04:02:33 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: No such file or directory Feb 23 04:02:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0-userdata-shm.mount: Deactivated successfully. Feb 23 04:02:33 localhost systemd[1]: var-lib-containers-storage-overlay-4752b195a0319d00ad3a4bd86f4312afcec268e914950a9934c95f1e8044f1fa-merged.mount: Deactivated successfully. 
Feb 23 04:02:33 localhost podman[111290]: 2026-02-23 09:02:33.713187978 +0000 UTC m=+0.128542530 container cleanup 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=) Feb 23 04:02:33 localhost podman[111290]: nova_migration_target Feb 23 04:02:33 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: No such file or directory Feb 23 04:02:33 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: No such file or directory Feb 23 04:02:33 localhost podman[111303]: 2026-02-23 09:02:33.753399029 +0000 UTC m=+0.073442036 container cleanup 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-type=git, tcib_managed=true, 
url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute) Feb 23 04:02:33 localhost 
systemd[1]: libpod-conmon-0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.scope: Deactivated successfully. Feb 23 04:02:33 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.timer: No such file or directory Feb 23 04:02:33 localhost systemd[1]: 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: Failed to open /run/systemd/transient/0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0.service: No such file or directory Feb 23 04:02:33 localhost podman[111319]: 2026-02-23 09:02:33.862204945 +0000 UTC m=+0.077746620 container cleanup 0bd3f7b5a491669c18e4fa63fd0684f8c857d2056b267d67e8ca783e46101db0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, container_name=nova_migration_target) Feb 23 04:02:33 localhost podman[111319]: nova_migration_target Feb 23 04:02:33 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Feb 23 04:02:33 localhost systemd[1]: Stopped nova_migration_target container. Feb 23 04:02:34 localhost python3.9[111423]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:02:34 localhost systemd[1]: Reloading. 
Feb 23 04:02:34 localhost systemd-sysv-generator[111454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:02:34 localhost systemd-rc-local-generator[111448]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:02:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:02:35 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Feb 23 04:02:35 localhost systemd[1]: libpod-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b.scope: Deactivated successfully. Feb 23 04:02:35 localhost podman[111464]: 2026-02-23 09:02:35.146176425 +0000 UTC m=+0.064858109 container died 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc., 
com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:02:35 localhost systemd[1]: tmp-crun.h6pGis.mount: Deactivated successfully. Feb 23 04:02:35 localhost podman[111464]: 2026-02-23 09:02:35.194470857 +0000 UTC m=+0.113152131 container cleanup 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, container_name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3) Feb 23 04:02:35 localhost podman[111464]: nova_virtlogd_wrapper Feb 23 04:02:35 localhost podman[111477]: 2026-02-23 09:02:35.226113733 +0000 UTC m=+0.071469176 container cleanup 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, 
name=nova_virtlogd_wrapper, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, container_name=nova_virtlogd_wrapper, version=17.1.13) Feb 23 04:02:36 localhost systemd[1]: var-lib-containers-storage-overlay-298dde95429fec5a28a160c47b2187ae70f7a465a9ef6c2faaa9c2f451a444ab-merged.mount: Deactivated successfully. Feb 23 04:02:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:02:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53887 DF PROTO=TCP SPT=35042 DPT=9101 SEQ=4242459794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBD6C60000000001030307) Feb 23 04:02:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43523 DF PROTO=TCP SPT=49306 DPT=9882 SEQ=474278600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBE3130000000001030307) Feb 23 04:02:41 localhost sshd[111495]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:02:42 localhost systemd[1]: Stopping User Manager for UID 0... Feb 23 04:02:42 localhost systemd[83969]: Activating special unit Exit the Session... Feb 23 04:02:42 localhost systemd[83969]: Removed slice User Background Tasks Slice. Feb 23 04:02:42 localhost systemd[83969]: Stopped target Main User Target. Feb 23 04:02:42 localhost systemd[83969]: Stopped target Basic System. Feb 23 04:02:42 localhost systemd[83969]: Stopped target Paths. Feb 23 04:02:42 localhost systemd[83969]: Stopped target Sockets. Feb 23 04:02:42 localhost systemd[83969]: Stopped target Timers. Feb 23 04:02:42 localhost systemd[83969]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 04:02:42 localhost systemd[83969]: Closed D-Bus User Message Bus Socket. Feb 23 04:02:42 localhost systemd[83969]: Stopped Create User's Volatile Files and Directories. Feb 23 04:02:42 localhost systemd[83969]: Removed slice User Application Slice. Feb 23 04:02:42 localhost systemd[83969]: Reached target Shutdown. Feb 23 04:02:42 localhost systemd[83969]: Finished Exit the Session. Feb 23 04:02:42 localhost systemd[83969]: Reached target Exit the Session. Feb 23 04:02:42 localhost systemd[1]: user@0.service: Deactivated successfully. 
Feb 23 04:02:42 localhost systemd[1]: Stopped User Manager for UID 0. Feb 23 04:02:42 localhost systemd[1]: user@0.service: Consumed 4.286s CPU time, no IO. Feb 23 04:02:42 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 23 04:02:42 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 23 04:02:42 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 23 04:02:42 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 23 04:02:42 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 23 04:02:42 localhost systemd[1]: user-0.slice: Consumed 5.290s CPU time. Feb 23 04:02:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43525 DF PROTO=TCP SPT=49306 DPT=9882 SEQ=474278600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBEF060000000001030307) Feb 23 04:02:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:02:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:02:42 localhost systemd[1]: tmp-crun.asB6Pc.mount: Deactivated successfully. 
Feb 23 04:02:42 localhost podman[111498]: 2026-02-23 09:02:42.958219546 +0000 UTC m=+0.108008414 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Feb 23 04:02:42 localhost podman[111498]: 2026-02-23 09:02:42.980146393 +0000 UTC m=+0.129935291 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 04:02:42 localhost podman[111498]: unhealthy Feb 23 04:02:42 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:02:42 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:02:43 localhost systemd[1]: tmp-crun.IJXuTJ.mount: Deactivated successfully. Feb 23 04:02:43 localhost podman[111516]: 2026-02-23 09:02:43.058166553 +0000 UTC m=+0.094228014 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9) Feb 23 04:02:43 localhost podman[111516]: 2026-02-23 09:02:43.077188391 +0000 UTC m=+0.113249882 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:02:43 localhost podman[111516]: unhealthy Feb 23 04:02:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:02:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 04:02:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39513 DF PROTO=TCP SPT=35126 DPT=9100 SEQ=1814791593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDBFA060000000001030307) Feb 23 04:02:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53889 DF PROTO=TCP SPT=35042 DPT=9101 SEQ=4242459794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC06060000000001030307) Feb 23 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7561 DF PROTO=TCP SPT=53886 DPT=9102 SEQ=1962421276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC1C900000000001030307) Feb 23 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43527 DF PROTO=TCP SPT=49306 DPT=9882 SEQ=474278600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC20060000000001030307) Feb 23 04:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7563 DF PROTO=TCP SPT=53886 DPT=9102 SEQ=1962421276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC28860000000001030307) Feb 23 04:03:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46947 DF PROTO=TCP SPT=37874 DPT=9100 SEQ=2996360404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BDC37060000000001030307) Feb 23 04:03:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46948 DF PROTO=TCP SPT=37874 DPT=9100 SEQ=2996360404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC3F060000000001030307) Feb 23 04:03:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43209 DF PROTO=TCP SPT=45530 DPT=9101 SEQ=900839925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC4C060000000001030307) Feb 23 04:03:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7565 DF PROTO=TCP SPT=53886 DPT=9102 SEQ=1962421276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC58060000000001030307) Feb 23 04:03:10 localhost sshd[111539]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:03:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45126 DF PROTO=TCP SPT=46666 DPT=9882 SEQ=3373547326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC64470000000001030307) Feb 23 04:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 04:03:13 localhost podman[111541]: 2026-02-23 09:03:13.156435408 +0000 UTC m=+0.081655943 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team) Feb 23 04:03:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 04:03:13 localhost podman[111541]: 2026-02-23 09:03:13.176351014 +0000 UTC m=+0.101571589 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, url=https://www.redhat.com) Feb 23 04:03:13 localhost podman[111541]: unhealthy Feb 23 04:03:13 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:03:13 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 04:03:13 localhost podman[111560]: 2026-02-23 09:03:13.253047125 +0000 UTC m=+0.084446668 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=) Feb 23 04:03:13 localhost podman[111560]: 2026-02-23 09:03:13.294267038 +0000 UTC m=+0.125666571 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Feb 23 04:03:13 localhost podman[111560]: unhealthy Feb 23 04:03:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:03:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:03:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29707 DF PROTO=TCP SPT=44550 DPT=9105 SEQ=3695610875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC6E060000000001030307) Feb 23 04:03:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43211 DF PROTO=TCP SPT=45530 DPT=9101 SEQ=900839925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC7C070000000001030307) Feb 23 04:03:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39298 DF PROTO=TCP SPT=36766 DPT=9102 SEQ=969065005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC91C00000000001030307) Feb 23 04:03:24 localhost sshd[111582]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:03:24 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45128 DF PROTO=TCP SPT=46666 DPT=9882 SEQ=3373547326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC94070000000001030307) Feb 23 04:03:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39300 DF PROTO=TCP SPT=36766 DPT=9102 SEQ=969065005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDC9DC60000000001030307) Feb 23 04:03:30 localhost podman[111687]: 2026-02-23 09:03:30.444127607 +0000 UTC m=+0.096610537 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image.) Feb 23 04:03:30 localhost podman[111687]: 2026-02-23 09:03:30.521159358 +0000 UTC m=+0.173642338 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, RELEASE=main, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:03:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14039 DF PROTO=TCP SPT=60982 DPT=9100 SEQ=1058392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCAC460000000001030307) Feb 23 04:03:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14040 DF PROTO=TCP SPT=60982 DPT=9100 SEQ=1058392723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCB4460000000001030307) Feb 23 04:03:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21755 DF PROTO=TCP SPT=47746 DPT=9101 SEQ=3930008034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCC1060000000001030307) Feb 23 04:03:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36762 DF PROTO=TCP SPT=53008 DPT=9882 SEQ=2756459503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCCD730000000001030307) Feb 23 04:03:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36764 DF PROTO=TCP SPT=53008 DPT=9882 SEQ=2756459503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCD9860000000001030307) Feb 23 04:03:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:03:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 04:03:43 localhost podman[111830]: 2026-02-23 09:03:43.4161829 +0000 UTC m=+0.085122787 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 
ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4) Feb 23 04:03:43 localhost podman[111830]: 2026-02-23 09:03:43.431207397 +0000 UTC m=+0.100147314 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 04:03:43 localhost podman[111830]: unhealthy Feb 23 04:03:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:03:43 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:03:43 localhost podman[111829]: 2026-02-23 09:03:43.515627223 +0000 UTC m=+0.187340295 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13) Feb 23 04:03:43 localhost podman[111829]: 
2026-02-23 09:03:43.533019741 +0000 UTC m=+0.204732783 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5) Feb 23 04:03:43 localhost podman[111829]: unhealthy Feb 23 04:03:43 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:03:43 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:03:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3698 DF PROTO=TCP SPT=38786 DPT=9105 SEQ=4223480551 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCE4060000000001030307) Feb 23 04:03:46 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Feb 23 04:03:46 localhost recover_tripleo_nova_virtqemud[111869]: 61982 Feb 23 04:03:46 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Feb 23 04:03:46 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Feb 23 04:03:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21757 DF PROTO=TCP SPT=47746 DPT=9101 SEQ=3930008034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDCF2070000000001030307) Feb 23 04:03:51 localhost sshd[111870]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17609 DF PROTO=TCP SPT=50946 DPT=9102 SEQ=2500752756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD06F00000000001030307) Feb 23 04:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36766 DF PROTO=TCP SPT=53008 DPT=9882 SEQ=2756459503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD0A060000000001030307) Feb 23 04:03:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17611 DF PROTO=TCP SPT=50946 DPT=9102 SEQ=2500752756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD13060000000001030307) Feb 23 04:03:59 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing. Feb 23 04:03:59 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61197 (conmon) with signal SIGKILL. 
Feb 23 04:03:59 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL Feb 23 04:03:59 localhost systemd[1]: libpod-conmon-215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b.scope: Deactivated successfully. Feb 23 04:03:59 localhost podman[111883]: error opening file `/run/crun/215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b/status`: No such file or directory Feb 23 04:03:59 localhost podman[111872]: 2026-02-23 09:03:59.403571792 +0000 UTC m=+0.074187355 container cleanup 215843e184902f90be3eb81ce66aa24806af3685bfe1167fb47a6baf7e2cdc6b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 04:03:59 localhost podman[111872]: nova_virtlogd_wrapper Feb 23 04:03:59 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'. 
Feb 23 04:03:59 localhost systemd[1]: Stopped nova_virtlogd_wrapper container. Feb 23 04:04:00 localhost python3.9[111976]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:04:00 localhost systemd[1]: Reloading. Feb 23 04:04:00 localhost systemd-rc-local-generator[112004]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:04:00 localhost systemd-sysv-generator[112009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:04:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:04:00 localhost systemd[1]: Stopping nova_virtnodedevd container... Feb 23 04:04:00 localhost systemd[1]: libpod-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope: Deactivated successfully. Feb 23 04:04:00 localhost systemd[1]: libpod-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope: Consumed 1.792s CPU time. 
Feb 23 04:04:00 localhost podman[112017]: 2026-02-23 09:04:00.678908281 +0000 UTC m=+0.103532478 container died 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Feb 23 04:04:00 localhost podman[112017]: 2026-02-23 09:04:00.715939907 +0000 UTC m=+0.140564044 container cleanup 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 04:04:00 localhost podman[112017]: nova_virtnodedevd Feb 23 04:04:00 localhost podman[112031]: 2026-02-23 09:04:00.774857428 +0000 UTC m=+0.079409345 container cleanup 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=nova_virtnodedevd, name=rhosp-rhel9/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, distribution-scope=public, summary=Red 
Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 04:04:00 localhost systemd[1]: libpod-conmon-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2.scope: Deactivated successfully. Feb 23 04:04:00 localhost podman[112059]: error opening file `/run/crun/930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2/status`: No such file or directory Feb 23 04:04:00 localhost podman[112048]: 2026-02-23 09:04:00.888598414 +0000 UTC m=+0.074566417 container cleanup 930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, config_id=tripleo_step3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 
'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible) Feb 23 04:04:00 localhost podman[112048]: nova_virtnodedevd Feb 23 04:04:00 localhost systemd[1]: tripleo_nova_virtnodedevd.service: 
Deactivated successfully. Feb 23 04:04:00 localhost systemd[1]: Stopped nova_virtnodedevd container. Feb 23 04:04:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62767 DF PROTO=TCP SPT=35560 DPT=9100 SEQ=267781514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD21870000000001030307) Feb 23 04:04:01 localhost python3.9[112152]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:04:01 localhost systemd[1]: var-lib-containers-storage-overlay-49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7-merged.mount: Deactivated successfully. Feb 23 04:04:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-930c6611f5962d33efb440404474ddbbb597649f3d0829a0c1f4e90bb33c2ea2-userdata-shm.mount: Deactivated successfully. Feb 23 04:04:01 localhost systemd[1]: Reloading. Feb 23 04:04:01 localhost systemd-rc-local-generator[112178]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:04:01 localhost systemd-sysv-generator[112183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:04:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:04:02 localhost systemd[1]: Stopping nova_virtproxyd container... Feb 23 04:04:02 localhost systemd[1]: tmp-crun.OJD2jf.mount: Deactivated successfully. 
Feb 23 04:04:02 localhost systemd[1]: libpod-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a.scope: Deactivated successfully. Feb 23 04:04:02 localhost podman[112193]: 2026-02-23 09:04:02.159946181 +0000 UTC m=+0.082770167 container died 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container) Feb 23 04:04:02 localhost podman[112193]: 2026-02-23 09:04:02.195573264 +0000 UTC m=+0.118397230 container cleanup 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=nova_virtproxyd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 04:04:02 localhost podman[112193]: nova_virtproxyd Feb 23 04:04:02 localhost podman[112208]: 2026-02-23 09:04:02.247011737 +0000 UTC m=+0.073522945 container cleanup 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtproxyd) Feb 23 04:04:02 localhost systemd[1]: libpod-conmon-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a.scope: Deactivated successfully. Feb 23 04:04:02 localhost podman[112238]: error opening file `/run/crun/33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a/status`: No such file or directory Feb 23 04:04:02 localhost podman[112226]: 2026-02-23 09:04:02.355824224 +0000 UTC m=+0.073340379 container cleanup 33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, container_name=nova_virtproxyd, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-libvirt) Feb 23 04:04:02 localhost podman[112226]: nova_virtproxyd Feb 23 04:04:02 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Feb 23 04:04:02 localhost systemd[1]: Stopped nova_virtproxyd container. Feb 23 04:04:02 localhost systemd[1]: var-lib-containers-storage-overlay-2be3d0bba76fb52fbeba06c336dea0a1698df79193676f245ce702f60a0a9fa3-merged.mount: Deactivated successfully. Feb 23 04:04:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-33b24c0b43815062cb55aef2cae21d9e0731c73ede479175c35b6a75b4421d9a-userdata-shm.mount: Deactivated successfully. Feb 23 04:04:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62768 DF PROTO=TCP SPT=35560 DPT=9100 SEQ=267781514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD29860000000001030307) Feb 23 04:04:03 localhost python3.9[112331]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:04:03 localhost systemd[1]: Reloading. Feb 23 04:04:03 localhost systemd-rc-local-generator[112356]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:04:03 localhost systemd-sysv-generator[112361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:04:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:04:03 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. 
Feb 23 04:04:03 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Feb 23 04:04:03 localhost systemd[1]: Stopping nova_virtqemud container... Feb 23 04:04:03 localhost systemd[1]: libpod-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope: Deactivated successfully. Feb 23 04:04:03 localhost systemd[1]: libpod-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope: Consumed 3.232s CPU time. Feb 23 04:04:03 localhost podman[112372]: 2026-02-23 09:04:03.678057238 +0000 UTC m=+0.080160967 container stop ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2026-01-12T23:31:49Z, container_name=nova_virtqemud, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 04:04:03 localhost podman[112372]: 2026-02-23 09:04:03.709178484 +0000 UTC m=+0.111282183 container died 
ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, build-date=2026-01-12T23:31:49Z, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public) Feb 23 04:04:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:04:03 localhost podman[112372]: 2026-02-23 09:04:03.740189376 +0000 UTC m=+0.142293075 container cleanup ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, container_name=nova_virtqemud, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:31:49Z, version=17.1.13) Feb 23 04:04:03 localhost podman[112372]: nova_virtqemud Feb 23 04:04:03 localhost podman[112386]: 2026-02-23 09:04:03.778328936 +0000 UTC m=+0.084063976 container cleanup ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, version=17.1.13, 
batch=17.1_20260112.1, tcib_managed=true, container_name=nova_virtqemud, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0) Feb 23 04:04:04 localhost systemd[1]: var-lib-containers-storage-overlay-7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff-merged.mount: Deactivated successfully. 
Feb 23 04:04:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35724 DF PROTO=TCP SPT=34998 DPT=9101 SEQ=1217985343 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD36470000000001030307) Feb 23 04:04:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52401 DF PROTO=TCP SPT=39404 DPT=9882 SEQ=1983269621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD42A30000000001030307) Feb 23 04:04:11 localhost sshd[112403]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:04:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52403 DF PROTO=TCP SPT=39404 DPT=9882 SEQ=1983269621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD4EC70000000001030307) Feb 23 04:04:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:04:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:04:13 localhost systemd[1]: tmp-crun.KvVvps.mount: Deactivated successfully. 
Feb 23 04:04:13 localhost podman[112406]: 2026-02-23 09:04:13.730964565 +0000 UTC m=+0.151726892 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Feb 23 04:04:13 localhost podman[112406]: 2026-02-23 09:04:13.748792558 +0000 UTC m=+0.169554845 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5) Feb 23 04:04:13 localhost podman[112406]: unhealthy Feb 23 04:04:13 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:04:13 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:04:13 localhost podman[112405]: 2026-02-23 09:04:13.684844794 +0000 UTC m=+0.106400935 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., managed_by=tripleo_ansible) Feb 23 04:04:13 localhost podman[112405]: 2026-02-23 09:04:13.820364162 +0000 UTC m=+0.241920293 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:04:13 localhost podman[112405]: unhealthy Feb 23 04:04:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:04:13 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:04:14 localhost systemd[1]: tmp-crun.x8qa7a.mount: Deactivated successfully. 
Feb 23 04:04:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62770 DF PROTO=TCP SPT=35560 DPT=9100 SEQ=267781514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD5A060000000001030307) Feb 23 04:04:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35726 DF PROTO=TCP SPT=34998 DPT=9101 SEQ=1217985343 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD66060000000001030307) Feb 23 04:04:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51737 DF PROTO=TCP SPT=36170 DPT=9102 SEQ=3860819404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD7C200000000001030307) Feb 23 04:04:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52405 DF PROTO=TCP SPT=39404 DPT=9882 SEQ=1983269621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD7E070000000001030307) Feb 23 04:04:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51739 DF PROTO=TCP SPT=36170 DPT=9102 SEQ=3860819404 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD88460000000001030307) Feb 23 04:04:30 localhost sshd[112446]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:04:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21407 DF PROTO=TCP SPT=42254 DPT=9100 
SEQ=2964484347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD96860000000001030307) Feb 23 04:04:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21408 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=2964484347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDD9E860000000001030307) Feb 23 04:04:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62864 DF PROTO=TCP SPT=33276 DPT=9101 SEQ=3462267381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDAB870000000001030307) Feb 23 04:04:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55635 DF PROTO=TCP SPT=53614 DPT=9882 SEQ=2399849443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDB7D30000000001030307) Feb 23 04:04:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55637 DF PROTO=TCP SPT=53614 DPT=9882 SEQ=2399849443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDC3C70000000001030307) Feb 23 04:04:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:04:43 localhost systemd[1]: tmp-crun.e7whMF.mount: Deactivated successfully. Feb 23 04:04:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 04:04:43 localhost podman[112525]: 2026-02-23 09:04:43.944899666 +0000 UTC m=+0.115046977 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1) Feb 23 04:04:43 localhost podman[112525]: 2026-02-23 09:04:43.96113687 +0000 UTC m=+0.131284181 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:04:43 localhost podman[112525]: unhealthy Feb 23 04:04:43 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:04:43 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. Feb 23 04:04:44 localhost podman[112541]: 2026-02-23 09:04:44.035982633 +0000 UTC m=+0.079306220 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, vcs-type=git, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true) Feb 23 04:04:44 localhost podman[112541]: 2026-02-23 09:04:44.057291811 +0000 UTC m=+0.100615388 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc.) Feb 23 04:04:44 localhost podman[112541]: unhealthy Feb 23 04:04:44 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:04:44 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. 
Feb 23 04:04:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21410 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=2964484347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDCE060000000001030307) Feb 23 04:04:48 localhost sshd[112563]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:04:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62866 DF PROTO=TCP SPT=33276 DPT=9101 SEQ=3462267381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDDC060000000001030307) Feb 23 04:04:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5031 DF PROTO=TCP SPT=51082 DPT=9102 SEQ=4270415838 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDF1500000000001030307) Feb 23 04:04:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55639 DF PROTO=TCP SPT=53614 DPT=9882 SEQ=2399849443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDF4060000000001030307) Feb 23 04:04:54 localhost sshd[112565]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:04:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5033 DF PROTO=TCP SPT=51082 DPT=9102 SEQ=4270415838 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDDFD460000000001030307) Feb 23 04:05:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23663 DF PROTO=TCP SPT=38396 DPT=9100 SEQ=343615187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE0BC60000000001030307) Feb 23 04:05:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23664 DF PROTO=TCP SPT=38396 DPT=9100 SEQ=343615187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE13C60000000001030307) Feb 23 04:05:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14849 DF PROTO=TCP SPT=39344 DPT=9101 SEQ=3084715503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE20C60000000001030307) Feb 23 04:05:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2410 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=2341706989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE2D030000000001030307) Feb 23 04:05:11 localhost sshd[112567]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:05:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2412 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=2341706989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE39060000000001030307) Feb 23 04:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. 
Feb 23 04:05:14 localhost podman[112569]: 2026-02-23 09:05:14.160558619 +0000 UTC m=+0.085859161 container health_status 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, release=1766032510) Feb 23 04:05:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. 
Feb 23 04:05:14 localhost podman[112569]: 2026-02-23 09:05:14.179409181 +0000 UTC m=+0.104709743 container exec_died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4) Feb 23 04:05:14 localhost podman[112569]: unhealthy Feb 23 04:05:14 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:05:14 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed with result 'exit-code'. 
Feb 23 04:05:14 localhost podman[112589]: 2026-02-23 09:05:14.258821705 +0000 UTC m=+0.075174276 container health_status 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5) Feb 23 04:05:14 localhost podman[112589]: 2026-02-23 09:05:14.299714087 +0000 UTC m=+0.116066668 container exec_died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, 
url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 04:05:14 localhost podman[112589]: unhealthy Feb 23 04:05:14 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:05:14 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed with result 'exit-code'. Feb 23 04:05:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10254 DF PROTO=TCP SPT=60970 DPT=9105 SEQ=4164321857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE44060000000001030307) Feb 23 04:05:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14851 DF PROTO=TCP SPT=39344 DPT=9101 SEQ=3084715503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE50060000000001030307) Feb 23 04:05:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42740 DF PROTO=TCP SPT=33396 DPT=9102 SEQ=2293908764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE66800000000001030307) Feb 23 04:05:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c 
MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2414 DF PROTO=TCP SPT=39234 DPT=9882 SEQ=2341706989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE6A060000000001030307) Feb 23 04:05:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42742 DF PROTO=TCP SPT=33396 DPT=9102 SEQ=2293908764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE72860000000001030307) Feb 23 04:05:27 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing. Feb 23 04:05:27 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 61978 (conmon) with signal SIGKILL. Feb 23 04:05:27 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL Feb 23 04:05:27 localhost systemd[1]: libpod-conmon-ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468.scope: Deactivated successfully. Feb 23 04:05:27 localhost podman[112620]: error opening file `/run/crun/ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468/status`: No such file or directory Feb 23 04:05:27 localhost systemd[1]: tmp-crun.EtWDjp.mount: Deactivated successfully. 
Feb 23 04:05:27 localhost podman[112609]: 2026-02-23 09:05:27.910698368 +0000 UTC m=+0.083325413 container cleanup ac0f39392edf3bb25f5f81a87a3332c629d29b63674f9d0e6bbbde82fcbac468 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, container_name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Feb 23 04:05:27 localhost podman[112609]: nova_virtqemud Feb 23 04:05:27 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'. Feb 23 04:05:27 localhost systemd[1]: Stopped nova_virtqemud container. Feb 23 04:05:28 localhost python3.9[112713]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:05:28 localhost systemd[1]: Reloading. 
Feb 23 04:05:28 localhost systemd-rc-local-generator[112736]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:05:28 localhost systemd-sysv-generator[112743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:05:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:05:29 localhost python3.9[112843]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:05:29 localhost systemd[1]: Reloading. Feb 23 04:05:29 localhost systemd-rc-local-generator[112866]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:05:29 localhost systemd-sysv-generator[112870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:05:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:05:30 localhost systemd[1]: Stopping nova_virtsecretd container... Feb 23 04:05:30 localhost systemd[1]: libpod-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f.scope: Deactivated successfully. 
Feb 23 04:05:30 localhost podman[112883]: 2026-02-23 09:05:30.213219754 +0000 UTC m=+0.064081658 container died c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, container_name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible) Feb 23 04:05:30 localhost podman[112883]: 2026-02-23 09:05:30.263111231 +0000 UTC m=+0.113973125 container cleanup c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Feb 23 04:05:30 localhost podman[112883]: nova_virtsecretd Feb 23 04:05:30 localhost podman[112897]: 2026-02-23 09:05:30.284132409 +0000 UTC m=+0.059720405 container cleanup c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5) Feb 23 04:05:30 localhost systemd[1]: libpod-conmon-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f.scope: Deactivated successfully. Feb 23 04:05:30 localhost podman[112926]: error opening file `/run/crun/c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f/status`: No such file or directory Feb 23 04:05:30 localhost podman[112914]: 2026-02-23 09:05:30.401002481 +0000 UTC m=+0.076642499 container cleanup c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 
'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_virtsecretd, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20260112.1) Feb 23 04:05:30 localhost podman[112914]: 
nova_virtsecretd Feb 23 04:05:30 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Feb 23 04:05:30 localhost systemd[1]: Stopped nova_virtsecretd container. Feb 23 04:05:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55035 DF PROTO=TCP SPT=59978 DPT=9100 SEQ=1291074508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE81060000000001030307) Feb 23 04:05:31 localhost python3.9[113019]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:05:31 localhost systemd[1]: var-lib-containers-storage-overlay-fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1-merged.mount: Deactivated successfully. Feb 23 04:05:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5df1accada064756d125bdbb5695a73715fe028f05153add7eb5ec7749f050f-userdata-shm.mount: Deactivated successfully. Feb 23 04:05:31 localhost systemd[1]: Reloading. Feb 23 04:05:31 localhost systemd-sysv-generator[113052]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:05:31 localhost systemd-rc-local-generator[113047]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:05:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:05:31 localhost systemd[1]: Stopping nova_virtstoraged container... 
Feb 23 04:05:31 localhost systemd[1]: libpod-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843.scope: Deactivated successfully. Feb 23 04:05:31 localhost podman[113060]: 2026-02-23 09:05:31.658293752 +0000 UTC m=+0.084386036 container died 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, container_name=nova_virtstoraged) Feb 23 04:05:31 localhost systemd[1]: tmp-crun.awQY0i.mount: Deactivated successfully. 
Feb 23 04:05:31 localhost podman[113060]: 2026-02-23 09:05:31.708144237 +0000 UTC m=+0.134236491 container cleanup 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_virtstoraged, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team) Feb 23 04:05:31 localhost podman[113060]: nova_virtstoraged Feb 23 04:05:31 localhost podman[113076]: 2026-02-23 09:05:31.756638971 +0000 UTC m=+0.086975475 container cleanup 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, version=17.1.13, container_name=nova_virtstoraged, 
com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe) Feb 23 04:05:31 localhost systemd[1]: libpod-conmon-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843.scope: Deactivated successfully. 
Feb 23 04:05:31 localhost podman[113105]: error opening file `/run/crun/5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843/status`: No such file or directory Feb 23 04:05:31 localhost podman[113093]: 2026-02-23 09:05:31.862479937 +0000 UTC m=+0.073775613 container cleanup 5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:05:31 localhost podman[113093]: nova_virtstoraged Feb 23 04:05:31 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. Feb 23 04:05:31 localhost systemd[1]: Stopped nova_virtstoraged container. Feb 23 04:05:32 localhost systemd[1]: var-lib-containers-storage-overlay-34d62c030d25095ae1697db07157c262435d04349696135717e45f6132a7e460-merged.mount: Deactivated successfully. 
Feb 23 04:05:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5312b8f35cc3bb510b53656211ca10c6fa5b207da4896c4ba9d4afdec879f843-userdata-shm.mount: Deactivated successfully. Feb 23 04:05:32 localhost python3.9[113198]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:05:32 localhost systemd[1]: Reloading. Feb 23 04:05:32 localhost systemd-rc-local-generator[113221]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:05:32 localhost systemd-sysv-generator[113226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:05:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:05:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55036 DF PROTO=TCP SPT=59978 DPT=9100 SEQ=1291074508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE89060000000001030307) Feb 23 04:05:33 localhost systemd[1]: Stopping ovn_controller container... Feb 23 04:05:33 localhost systemd[1]: tmp-crun.v0iYv9.mount: Deactivated successfully. Feb 23 04:05:33 localhost systemd[1]: libpod-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope: Deactivated successfully. Feb 23 04:05:33 localhost systemd[1]: libpod-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope: Consumed 3.367s CPU time. 
Feb 23 04:05:33 localhost podman[113239]: 2026-02-23 09:05:33.152157511 +0000 UTC m=+0.121616956 container died 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:05:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: Deactivated successfully. Feb 23 04:05:33 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e. Feb 23 04:05:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: No such file or directory Feb 23 04:05:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e-userdata-shm.mount: Deactivated successfully. Feb 23 04:05:33 localhost systemd[1]: var-lib-containers-storage-overlay-4c9543ebc437e655c5d4bfa732d117b93efd0b526ada85fb52dfe4d58e51e764-merged.mount: Deactivated successfully. 
Feb 23 04:05:33 localhost podman[113239]: 2026-02-23 09:05:33.204911555 +0000 UTC m=+0.174370980 container cleanup 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., container_name=ovn_controller, release=1766032510, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Feb 23 04:05:33 localhost podman[113239]: ovn_controller Feb 23 04:05:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: No such file or directory Feb 23 04:05:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: No such file or directory Feb 23 04:05:33 localhost podman[113252]: 2026-02-23 09:05:33.242470356 +0000 UTC m=+0.077771634 container cleanup 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Feb 23 04:05:33 localhost systemd[1]: libpod-conmon-1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.scope: Deactivated successfully. 
Feb 23 04:05:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.timer: No such file or directory Feb 23 04:05:33 localhost systemd[1]: 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: Failed to open /run/systemd/transient/1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e.service: No such file or directory Feb 23 04:05:33 localhost podman[113266]: 2026-02-23 09:05:33.345224019 +0000 UTC m=+0.071963148 container cleanup 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=) Feb 23 04:05:33 localhost podman[113266]: ovn_controller Feb 23 04:05:33 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Feb 23 04:05:33 localhost systemd[1]: Stopped ovn_controller container. Feb 23 04:05:34 localhost python3.9[113371]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:05:35 localhost systemd[1]: Reloading. Feb 23 04:05:35 localhost systemd-rc-local-generator[113395]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:05:35 localhost systemd-sysv-generator[113403]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:05:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:05:35 localhost systemd[1]: Stopping ovn_metadata_agent container... 
Feb 23 04:05:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17008 DF PROTO=TCP SPT=40828 DPT=9101 SEQ=3887796689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDE96060000000001030307) Feb 23 04:05:36 localhost systemd[1]: libpod-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope: Deactivated successfully. Feb 23 04:05:36 localhost systemd[1]: libpod-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope: Consumed 13.160s CPU time. Feb 23 04:05:36 localhost podman[113412]: 2026-02-23 09:05:36.515765045 +0000 UTC m=+0.924393054 container died 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13) Feb 23 04:05:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: Deactivated successfully. 
Feb 23 04:05:36 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9. Feb 23 04:05:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: No such file or directory Feb 23 04:05:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9-userdata-shm.mount: Deactivated successfully. Feb 23 04:05:36 localhost podman[113412]: 2026-02-23 09:05:36.648330493 +0000 UTC m=+1.056958442 container cleanup 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1766032510) Feb 23 04:05:36 localhost podman[113412]: ovn_metadata_agent Feb 23 04:05:36 localhost sshd[113469]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:05:36 localhost systemd[1]: var-lib-containers-storage-overlay-6ccd32925d3fd52f95d80d5d3005423a627f4aa5e2e72537587c6c2e01c55ed4-merged.mount: Deactivated 
successfully. Feb 23 04:05:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: No such file or directory Feb 23 04:05:36 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: No such file or directory Feb 23 04:05:36 localhost podman[113450]: 2026-02-23 09:05:36.677795089 +0000 UTC m=+0.150162235 container cleanup 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4) Feb 23 04:05:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42744 DF PROTO=TCP SPT=33396 DPT=9102 SEQ=2293908764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEA2060000000001030307) Feb 23 04:05:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c 
MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33114 DF PROTO=TCP SPT=58308 DPT=9882 SEQ=1548170034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEAE470000000001030307) Feb 23 04:05:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31859 DF PROTO=TCP SPT=46122 DPT=9105 SEQ=4028789398 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEB8060000000001030307) Feb 23 04:05:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17010 DF PROTO=TCP SPT=40828 DPT=9101 SEQ=3887796689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEC6060000000001030307) Feb 23 04:05:51 localhost sshd[113520]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8423 DF PROTO=TCP SPT=54902 DPT=9102 SEQ=2603747387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEDBB00000000001030307) Feb 23 04:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33116 DF PROTO=TCP SPT=58308 DPT=9882 SEQ=1548170034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEDE060000000001030307) Feb 23 04:05:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8425 DF PROTO=TCP SPT=54902 DPT=9102 SEQ=2603747387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BDEE7C60000000001030307) Feb 23 04:06:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30169 DF PROTO=TCP SPT=60204 DPT=9100 SEQ=2391636339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEF6460000000001030307) Feb 23 04:06:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30170 DF PROTO=TCP SPT=60204 DPT=9100 SEQ=2391636339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDEFE460000000001030307) Feb 23 04:06:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18536 DF PROTO=TCP SPT=35264 DPT=9101 SEQ=1628879478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF0B060000000001030307) Feb 23 04:06:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2540 DF PROTO=TCP SPT=57692 DPT=9882 SEQ=742997640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF17630000000001030307) Feb 23 04:06:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2542 DF PROTO=TCP SPT=57692 DPT=9882 SEQ=742997640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF23860000000001030307) Feb 23 04:06:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30172 DF PROTO=TCP SPT=60204 DPT=9100 SEQ=2391636339 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A3BDF2E060000000001030307) Feb 23 04:06:15 localhost sshd[113522]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:06:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18538 DF PROTO=TCP SPT=35264 DPT=9101 SEQ=1628879478 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF3C060000000001030307) Feb 23 04:06:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10826 DF PROTO=TCP SPT=56876 DPT=9102 SEQ=4214466874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF50E20000000001030307) Feb 23 04:06:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2544 DF PROTO=TCP SPT=57692 DPT=9882 SEQ=742997640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF54070000000001030307) Feb 23 04:06:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10828 DF PROTO=TCP SPT=56876 DPT=9102 SEQ=4214466874 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF5D060000000001030307) Feb 23 04:06:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21261 DF PROTO=TCP SPT=58144 DPT=9100 SEQ=1841808314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF6B460000000001030307) Feb 23 04:06:31 localhost sshd[113524]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:06:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c 
MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21262 DF PROTO=TCP SPT=58144 DPT=9100 SEQ=1841808314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF73460000000001030307) Feb 23 04:06:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16432 DF PROTO=TCP SPT=38864 DPT=9101 SEQ=1068486902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF80460000000001030307) Feb 23 04:06:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58685 DF PROTO=TCP SPT=40918 DPT=9882 SEQ=3224193056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF8C930000000001030307) Feb 23 04:06:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58687 DF PROTO=TCP SPT=40918 DPT=9882 SEQ=3224193056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDF98860000000001030307) Feb 23 04:06:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21264 DF PROTO=TCP SPT=58144 DPT=9100 SEQ=1841808314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFA4060000000001030307) Feb 23 04:06:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16434 DF PROTO=TCP SPT=38864 DPT=9101 SEQ=1068486902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFB0060000000001030307) Feb 23 04:06:54 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23993 DF PROTO=TCP SPT=55840 DPT=9102 SEQ=1594204587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFC6100000000001030307) Feb 23 04:06:54 localhost sshd[113603]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:06:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58689 DF PROTO=TCP SPT=40918 DPT=9882 SEQ=3224193056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFC8060000000001030307) Feb 23 04:06:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23995 DF PROTO=TCP SPT=55840 DPT=9102 SEQ=1594204587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFD2060000000001030307) Feb 23 04:07:00 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Feb 23 04:07:00 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 71305 (conmon) with signal SIGKILL. Feb 23 04:07:00 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Feb 23 04:07:00 localhost systemd[1]: libpod-conmon-9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.scope: Deactivated successfully. 
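The stop sequence recorded above ("State 'stop-sigterm' timed out. Killing." followed by SIGKILL and "Failed with result 'timeout'") is systemd's standard stop-timeout escalation: after TimeoutStopSec expires without the SIGTERM being honored, the manager kills the remaining processes. As a hedged illustration only — the unit name is taken from the log, but the timeout value below is an assumption, not a setting read from this host — a drop-in raising the stop timeout would look like:

```ini
# /etc/systemd/system/tripleo_ovn_metadata_agent.service.d/timeout.conf
# Illustrative drop-in; the 300 s value is an assumption, not this host's setting.
[Service]
TimeoutStopSec=300
```

A drop-in like this only takes effect after `systemctl daemon-reload`, which is consistent with the `Reloading.` entries that appear later in this log.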
Feb 23 04:07:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2972 DF PROTO=TCP SPT=37642 DPT=9100 SEQ=742417238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFE0860000000001030307) Feb 23 04:07:00 localhost podman[113617]: error opening file `/run/crun/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9/status`: No such file or directory Feb 23 04:07:00 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.timer: No such file or directory Feb 23 04:07:00 localhost systemd[1]: 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: Failed to open /run/systemd/transient/9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9.service: No such file or directory Feb 23 04:07:00 localhost podman[113605]: 2026-02-23 09:07:00.93436007 +0000 UTC m=+0.095711453 container cleanup 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red 
Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64) Feb 23 04:07:00 localhost podman[113605]: ovn_metadata_agent Feb 23 04:07:00 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Feb 23 04:07:00 localhost systemd[1]: Stopped ovn_metadata_agent container. Feb 23 04:07:01 localhost python3.9[113710]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:07:01 localhost systemd[1]: Reloading. Feb 23 04:07:01 localhost systemd-sysv-generator[113738]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:07:01 localhost systemd-rc-local-generator[113734]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:07:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:07:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2973 DF PROTO=TCP SPT=37642 DPT=9100 SEQ=742417238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFE8860000000001030307) Feb 23 04:07:03 localhost python3.9[113840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:04 localhost python3.9[113932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:04 localhost python3.9[114024]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:05 localhost python3.9[114116]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:06 localhost python3.9[114208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43779 DF PROTO=TCP SPT=34978 DPT=9101 SEQ=819520041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BDFF5870000000001030307) Feb 23 04:07:06 localhost python3.9[114300]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:07 localhost python3.9[114392]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 23 04:07:07 localhost python3.9[114484]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:08 localhost python3.9[114576]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:09 localhost python3.9[114668]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19751 DF PROTO=TCP SPT=60408 DPT=9882 SEQ=2142405796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE001C30000000001030307) Feb 23 04:07:09 localhost python3.9[114760]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:10 localhost sshd[114852]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:07:10 localhost python3.9[114854]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:10 localhost python3.9[114946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:11 localhost python3.9[115038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:12 localhost python3.9[115130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19753 DF PROTO=TCP SPT=60408 DPT=9882 SEQ=2142405796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE00DC60000000001030307) Feb 23 04:07:12 localhost python3.9[115222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:13 localhost python3.9[115314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:13 localhost python3.9[115406]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:14 localhost python3.9[115498]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38536 DF PROTO=TCP SPT=52248 DPT=9105 SEQ=1941285649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE018060000000001030307) Feb 23 04:07:15 localhost python3.9[115590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:15 localhost python3.9[115682]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:16 localhost python3.9[115774]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:16 localhost python3.9[115866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:17 localhost python3.9[115958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:18 localhost python3.9[116050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=43781 DF PROTO=TCP SPT=34978 DPT=9101 SEQ=819520041 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE026070000000001030307) Feb 23 04:07:18 localhost python3.9[116142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:19 localhost python3.9[116234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:20 localhost python3.9[116326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:20 localhost python3.9[116418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 23 04:07:21 localhost python3.9[116510]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:22 localhost python3.9[116602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:22 localhost python3.9[116694]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:23 localhost python3.9[116786]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:24 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36203 DF PROTO=TCP SPT=35118 DPT=9102 SEQ=663757124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE03B3F0000000001030307) Feb 23 04:07:24 localhost python3.9[116878]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19755 DF PROTO=TCP SPT=60408 DPT=9882 SEQ=2142405796 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE03E060000000001030307) Feb 23 04:07:24 localhost python3.9[116970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:25 localhost python3.9[117062]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Feb 23 04:07:26 localhost python3.9[117154]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:26 localhost python3.9[117246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36205 DF PROTO=TCP SPT=35118 DPT=9102 SEQ=663757124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE047470000000001030307) Feb 23 04:07:27 localhost python3.9[117338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:27 localhost python3.9[117430]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:28 localhost python3.9[117522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:29 localhost python3.9[117614]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:07:30 localhost python3.9[117706]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38180 DF PROTO=TCP SPT=56580 DPT=9100 SEQ=3542714595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BE055C60000000001030307) Feb 23 04:07:31 localhost python3.9[117798]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 23 04:07:31 localhost python3.9[117890]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:07:31 localhost systemd[1]: Reloading. Feb 23 04:07:31 localhost systemd-rc-local-generator[117916]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:07:31 localhost systemd-sysv-generator[117919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:07:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:07:32 localhost sshd[117953]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:07:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38181 DF PROTO=TCP SPT=56580 DPT=9100 SEQ=3542714595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE05DC60000000001030307) Feb 23 04:07:33 localhost python3.9[118019]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:33 localhost python3.9[118113]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:34 localhost python3.9[118206]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:34 localhost python3.9[118299]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:35 localhost python3.9[118392]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:36 localhost python3.9[118485]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65507 DF PROTO=TCP SPT=41786 DPT=9101 SEQ=1812879918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE06A860000000001030307) Feb 23 04:07:36 localhost python3.9[118578]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:37 localhost python3.9[118671]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:37 localhost python3.9[118764]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:38 localhost python3.9[118857]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True 
stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:38 localhost python3.9[118950]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:39 localhost python3.9[119043]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3526 DF PROTO=TCP SPT=57328 DPT=9882 SEQ=1680368305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE076F30000000001030307) Feb 23 04:07:39 localhost python3.9[119166]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:40 localhost python3.9[119292]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:41 localhost python3.9[119385]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service 
_uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:41 localhost python3.9[119478]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3528 DF PROTO=TCP SPT=57328 DPT=9882 SEQ=1680368305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE083060000000001030307) Feb 23 04:07:42 localhost python3.9[119571]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:43 localhost python3.9[119664]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:43 localhost python3.9[119757]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:44 localhost python3.9[119865]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed 
tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:44 localhost python3.9[119958]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:07:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38183 DF PROTO=TCP SPT=56580 DPT=9100 SEQ=3542714595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE08E060000000001030307) Feb 23 04:07:46 localhost systemd[1]: session-37.scope: Deactivated successfully. Feb 23 04:07:46 localhost systemd[1]: session-37.scope: Consumed 50.506s CPU time. Feb 23 04:07:46 localhost systemd-logind[759]: Session 37 logged out. Waiting for processes to exit. Feb 23 04:07:46 localhost systemd-logind[759]: Removed session 37. 
Feb 23 04:07:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65509 DF PROTO=TCP SPT=41786 DPT=9101 SEQ=1812879918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE09A060000000001030307) Feb 23 04:07:48 localhost sshd[119974]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:07:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17603 DF PROTO=TCP SPT=55448 DPT=9102 SEQ=3267799634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0B06F0000000001030307) Feb 23 04:07:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3530 DF PROTO=TCP SPT=57328 DPT=9882 SEQ=1680368305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0B4070000000001030307) Feb 23 04:07:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17605 DF PROTO=TCP SPT=55448 DPT=9102 SEQ=3267799634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0BC860000000001030307) Feb 23 04:08:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39098 DF PROTO=TCP SPT=56310 DPT=9100 SEQ=1508328302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0CB060000000001030307) Feb 23 04:08:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39099 DF PROTO=TCP SPT=56310 DPT=9100 
SEQ=1508328302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0D3070000000001030307) Feb 23 04:08:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22098 DF PROTO=TCP SPT=46158 DPT=9101 SEQ=4033996518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0DFC70000000001030307) Feb 23 04:08:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17607 DF PROTO=TCP SPT=55448 DPT=9102 SEQ=3267799634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0EC060000000001030307) Feb 23 04:08:12 localhost sshd[119976]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:08:12 localhost systemd-logind[759]: New session 38 of user zuul. Feb 23 04:08:12 localhost systemd[1]: Started Session 38 of User zuul. 
Feb 23 04:08:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22274 DF PROTO=TCP SPT=45906 DPT=9882 SEQ=817962013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE0F8460000000001030307) Feb 23 04:08:12 localhost python3.9[120069]: ansible-ansible.legacy.ping Invoked with data=pong Feb 23 04:08:13 localhost python3.9[120173]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:08:14 localhost python3.9[120265]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:08:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58633 DF PROTO=TCP SPT=55134 DPT=9105 SEQ=2418348777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE102060000000001030307) Feb 23 04:08:15 localhost sshd[120281]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:08:16 localhost python3.9[120360]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:08:16 localhost python3.9[120452]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None 
seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:08:17 localhost python3.9[120544]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:08:18 localhost python3.9[120617]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771837697.1066446-172-217122759841424/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:08:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22100 DF PROTO=TCP SPT=46158 DPT=9101 SEQ=4033996518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE110060000000001030307) Feb 23 04:08:19 localhost python3.9[120709]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:08:20 localhost python3.9[120805]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:08:20 localhost python3.9[120897]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated 
recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:08:21 localhost python3.9[120987]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:08:21 localhost network[121004]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:08:21 localhost network[121005]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:08:21 localhost network[121006]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 04:08:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:08:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46956 DF PROTO=TCP SPT=33590 DPT=9102 SEQ=1229274110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE125A00000000001030307) Feb 23 04:08:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22276 DF PROTO=TCP SPT=45906 DPT=9882 SEQ=817962013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE128060000000001030307) Feb 23 04:08:25 localhost python3.9[121203]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Feb 23 04:08:26 localhost sshd[121294]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:08:26 localhost python3.9[121293]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:08:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46958 DF PROTO=TCP SPT=33590 DPT=9102 SEQ=1229274110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE131C70000000001030307) Feb 23 04:08:27 localhost python3.9[121391]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:08:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5322 DF PROTO=TCP SPT=52164 DPT=9100 SEQ=733606790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE140060000000001030307) Feb 23 04:08:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5323 DF PROTO=TCP SPT=52164 DPT=9100 SEQ=733606790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE148060000000001030307) Feb 23 04:08:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21862 DF PROTO=TCP SPT=43206 DPT=9101 SEQ=3425392840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE155060000000001030307) Feb 23 04:08:37 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 23 04:08:37 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 23 04:08:37 localhost systemd[1]: Stopped OpenSSH server daemon. Feb 23 04:08:37 localhost systemd[1]: sshd.service: Consumed 13.222s CPU time. Feb 23 04:08:37 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 23 04:08:37 localhost systemd[1]: Stopping sshd-keygen.target... Feb 23 04:08:37 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 23 04:08:37 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 23 04:08:37 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Feb 23 04:08:37 localhost systemd[1]: Reached target sshd-keygen.target. Feb 23 04:08:37 localhost systemd[1]: Starting OpenSSH server daemon... Feb 23 04:08:37 localhost sshd[121434]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:08:37 localhost systemd[1]: Started OpenSSH server daemon. Feb 23 04:08:37 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 04:08:37 localhost systemd[1]: Starting man-db-cache-update.service... Feb 23 04:08:37 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Feb 23 04:08:38 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 23 04:08:38 localhost systemd[1]: Finished man-db-cache-update.service. Feb 23 04:08:38 localhost systemd[1]: run-r565896e9ac224709961ce98512c68dcf.service: Deactivated successfully. Feb 23 04:08:38 localhost systemd[1]: run-r7bf820a03f9a4132b58b917aec5b33a4.service: Deactivated successfully. Feb 23 04:08:38 localhost systemd[1]: Stopping OpenSSH server daemon... Feb 23 04:08:38 localhost systemd[1]: sshd.service: Deactivated successfully. Feb 23 04:08:38 localhost systemd[1]: Stopped OpenSSH server daemon. Feb 23 04:08:38 localhost systemd[1]: Stopped target sshd-keygen.target. Feb 23 04:08:38 localhost systemd[1]: Stopping sshd-keygen.target... Feb 23 04:08:38 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 23 04:08:38 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Feb 23 04:08:38 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). 
Feb 23 04:08:38 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 23 04:08:38 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 23 04:08:38 localhost sshd[122114]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:08:38 localhost systemd[1]: Started OpenSSH server daemon.
Feb 23 04:08:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37809 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE161560000000001030307)
Feb 23 04:08:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37811 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE16D470000000001030307)
Feb 23 04:08:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5325 DF PROTO=TCP SPT=52164 DPT=9100 SEQ=733606790 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE178060000000001030307)
Feb 23 04:08:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21864 DF PROTO=TCP SPT=43206 DPT=9101 SEQ=3425392840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE186060000000001030307)
Feb 23 04:08:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42056 DF PROTO=TCP SPT=58116 DPT=9102 SEQ=1398528663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE19AD00000000001030307)
Feb 23 04:08:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37813 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE19E070000000001030307)
Feb 23 04:08:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42058 DF PROTO=TCP SPT=58116 DPT=9102 SEQ=1398528663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1A6C60000000001030307)
Feb 23 04:08:57 localhost sshd[122339]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:09:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23207 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=3219387706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1B5470000000001030307)
Feb 23 04:09:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 04:09:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 04:09:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23208 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=3219387706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1BD460000000001030307)
Feb 23 04:09:04 localhost sshd[122357]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:09:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7286 DF PROTO=TCP SPT=37554 DPT=9101 SEQ=2171848523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1CA470000000001030307)
Feb 23 04:09:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 04:09:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 04:09:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42060 DF PROTO=TCP SPT=58116 DPT=9102 SEQ=1398528663 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1D6060000000001030307)
Feb 23 04:09:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34131 DF PROTO=TCP SPT=43926 DPT=9882 SEQ=210262696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1E2870000000001030307)
Feb 23 04:09:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23210 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=3219387706 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1EE060000000001030307)
Feb 23 04:09:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7288 DF PROTO=TCP SPT=37554 DPT=9101 SEQ=2171848523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE1FA070000000001030307)
Feb 23 04:09:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17243 DF PROTO=TCP SPT=48718 DPT=9102 SEQ=1691152045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE210000000000001030307)
Feb 23 04:09:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34133 DF PROTO=TCP SPT=43926 DPT=9882 SEQ=210262696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE212060000000001030307)
Feb 23 04:09:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17245 DF PROTO=TCP SPT=48718 DPT=9102 SEQ=1691152045 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE21C060000000001030307)
Feb 23 04:09:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11673 DF PROTO=TCP SPT=60216 DPT=9100 SEQ=2551074236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE22A860000000001030307)
Feb 23 04:09:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11674 DF PROTO=TCP SPT=60216 DPT=9100 SEQ=2551074236 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE232870000000001030307)
Feb 23 04:09:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45617 DF PROTO=TCP SPT=47056 DPT=9101 SEQ=2913212424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE23F470000000001030307)
Feb 23 04:09:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28896 DF PROTO=TCP SPT=53872 DPT=9882 SEQ=927108265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE24BB30000000001030307)
Feb 23 04:09:41 localhost sshd[122580]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:09:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28898 DF PROTO=TCP SPT=53872 DPT=9882 SEQ=927108265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE257C60000000001030307)
Feb 23 04:09:43 localhost sshd[122589]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:09:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17225 DF PROTO=TCP SPT=46882 DPT=9105 SEQ=1727784005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE262060000000001030307)
Feb 23 04:09:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45619 DF PROTO=TCP SPT=47056 DPT=9101 SEQ=2913212424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE270060000000001030307)
Feb 23 04:09:50 localhost podman[122746]:
Feb 23 04:09:50 localhost podman[122746]: 2026-02-23 09:09:50.782506241 +0000 UTC m=+0.080340562 container create 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1770267347, maintainer=Guillaume Abrioux , ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Feb 23 04:09:50 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=15 res=1
Feb 23 04:09:50 localhost systemd[1]: Started libpod-conmon-73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056.scope.
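The repeated kernel `DROPPING:` entries above have the field layout of a netfilter LOG rule (the prefix string is set by whatever nftables/iptables rule emitted them). A minimal sketch of pulling the key=value fields out of one such entry for triage; the parsing approach is an assumption based only on the space-separated layout visible in this log:

```python
import re

def parse_netfilter_log(line: str) -> dict:
    """Parse key=value fields from a kernel netfilter LOG entry.
    Flag-only tokens such as DF or SYN are stored with value True."""
    fields = {}
    # Everything after the log prefix is the packet field dump.
    payload = line.split("DROPPING:", 1)[1]
    for token in payload.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
        elif re.fullmatch(r"[A-Z]+", token):
            fields[token] = True  # bare flag token (DF, SYN, ...)
    return fields

# One entry from the log above, with the trailing OPT dump omitted.
entry = ("Feb 23 04:08:39 localhost kernel: DROPPING: IN=br-ex OUT= "
         "MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 "
         "SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 "
         "TTL=62 ID=37809 DF PROTO=TCP SPT=37182 DPT=9882 SEQ=1659102619 "
         "ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0")
f = parse_netfilter_log(entry)
print(f["SRC"], f["DPT"], f.get("SYN"))  # → 192.168.122.10 9882 True
```

Grouping these by DPT quickly shows the drops are all SYN retries from 192.168.122.10 to the metrics ports 9100-9105 and 9882.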
Feb 23 04:09:50 localhost podman[122746]: 2026-02-23 09:09:50.750025796 +0000 UTC m=+0.047860137 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:09:50 localhost systemd[1]: Started libcrun container.
Feb 23 04:09:50 localhost podman[122746]: 2026-02-23 09:09:50.86771132 +0000 UTC m=+0.165545641 container init 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Feb 23 04:09:50 localhost podman[122746]: 2026-02-23 09:09:50.878286774 +0000 UTC m=+0.176121065 container start 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2026-02-09T10:25:24Z, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , distribution-scope=public, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 04:09:50 localhost podman[122746]: 2026-02-23 09:09:50.878632815 +0000 UTC m=+0.176467136 container attach 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 23 04:09:50 localhost priceless_ptolemy[122761]: 167 167
Feb 23 04:09:50 localhost systemd[1]: libpod-73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056.scope: Deactivated successfully.
Feb 23 04:09:50 localhost podman[122746]: 2026-02-23 09:09:50.883955048 +0000 UTC m=+0.181789389 container died 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, maintainer=Guillaume Abrioux , name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, release=1770267347)
Feb 23 04:09:50 localhost podman[122767]: 2026-02-23 09:09:50.981168466 +0000 UTC m=+0.082152517 container remove 73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_ptolemy, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 23 04:09:50 localhost systemd[1]: libpod-conmon-73570b630d4d3ac82fd7c20c5053de8abe9ee63d056ea907e28575209beed056.scope: Deactivated successfully.
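The ceph-osd rocksdb "DUMPING STATS" entries earlier in this window are single journal messages whose embedded newlines were escaped as `#012` (an octal `#NNN` control-character escape, as rsyslog produces). A small sketch that undoes that escaping and pulls one counter out, using a shortened stand-in for the real message:

```python
import re

# Shortened stand-in for the ceph-osd[31633] rocksdb stats message above.
STATS_MSG = (
    "#012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval"
    "#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, "
    "1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s"
    "#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent"
)

def unescape_syslog(msg: str) -> str:
    """Turn #NNN octal escapes back into their characters (#012 -> newline)."""
    return re.sub(r"#(\d{3})", lambda m: chr(int(m.group(1), 8)), msg)

text = unescape_syslog(STATS_MSG)
writes = int(re.search(r"Cumulative writes: (\d+) writes", text).group(1))
print(writes)            # → 5152
print(text.count("\n"))  # → 4 (one per #012 escape)
```

With the escapes undone, the dump reads as the multi-line table rocksdb originally printed, which makes diffing successive 600-second intervals straightforward.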
Feb 23 04:09:51 localhost podman[122789]:
Feb 23 04:09:51 localhost podman[122789]: 2026-02-23 09:09:51.207196449 +0000 UTC m=+0.070952745 container create 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux , release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main)
Feb 23 04:09:51 localhost systemd[1]: Started libpod-conmon-3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013.scope.
Feb 23 04:09:51 localhost podman[122789]: 2026-02-23 09:09:51.164309504 +0000 UTC m=+0.028065830 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:09:51 localhost systemd[1]: Started libcrun container.
Feb 23 04:09:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 23 04:09:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 04:09:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 04:09:51 localhost podman[122789]: 2026-02-23 09:09:51.27648362 +0000 UTC m=+0.140239906 container init 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, ceph=True, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 04:09:51 localhost podman[122789]: 2026-02-23 09:09:51.290780328 +0000 UTC m=+0.154536614 container start 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=)
Feb 23 04:09:51 localhost podman[122789]: 2026-02-23 09:09:51.29115519 +0000 UTC m=+0.154911476 container attach 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Feb 23 04:09:51 localhost systemd[1]: var-lib-containers-storage-overlay-be90ae7fae200ac1f3bd074b36960bb9ba70b1e78bbd96c6196b1bd2f718c3f9-merged.mount: Deactivated successfully.
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: [
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: {
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "available": false,
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "ceph_device": false,
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "device_id": "QEMU_DVD-ROM_QM00001",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "lsm_data": {},
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "lvs": [],
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "path": "/dev/sr0",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "rejected_reasons": [
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "Insufficient space (<5GB)",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "Has a FileSystem"
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: ],
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "sys_api": {
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "actuators": null,
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "device_nodes": "sr0",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "human_readable_size": "482.00 KB",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "id_bus": "ata",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "model": "QEMU DVD-ROM",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "nr_requests": "2",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "partitions": {},
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "path": "/dev/sr0",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "removable": "1",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "rev": "2.5+",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "ro": "0",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "rotational": "1",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "sas_address": "",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "sas_device_handle": "",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "scheduler_mode": "mq-deadline",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "sectors": 0,
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "sectorsize": "2048",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "size": 493568.0,
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "support_discard": "0",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "type": "disk",
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: "vendor": "QEMU"
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: }
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: }
Feb 23 04:09:52 localhost eloquent_sutherland[122806]: ]
Feb 23 04:09:52 localhost systemd[1]: libpod-3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013.scope: Deactivated successfully.
Feb 23 04:09:52 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=16 res=1
Feb 23 04:09:52 localhost podman[124397]: 2026-02-23 09:09:52.240448705 +0000 UTC m=+0.032409693 container died 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git)
Feb 23 04:09:52 localhost systemd[1]: var-lib-containers-storage-overlay-64f78aea2c8e70a7ac0cbc19bd4a96fb4d1b27dd765b2224b3fd1e2e1d95dd9f-merged.mount: Deactivated successfully.
Feb 23 04:09:52 localhost podman[124397]: 2026-02-23 09:09:52.283216955 +0000 UTC m=+0.075177963 container remove 3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_sutherland, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, version=7, ceph=True, architecture=x86_64, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, GIT_CLEAN=True, io.buildah.version=1.42.2, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 04:09:52 localhost systemd[1]: libpod-conmon-3990832ba815c78eb74db3f29d9757eea04043cb6180ec5e8034d7f78fa6e013.scope: Deactivated successfully.
Feb 23 04:09:52 localhost kernel: SELinux: Converting 2754 SID table entries...
Feb 23 04:09:52 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 04:09:52 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 04:09:52 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 04:09:52 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 04:09:52 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 04:09:52 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 04:09:52 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 04:09:53 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=17 res=1
Feb 23 04:09:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34404 DF PROTO=TCP SPT=40142 DPT=9102 SEQ=420552745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE285300000000001030307)
Feb 23 04:09:54 localhost python3.9[124613]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:09:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28900 DF PROTO=TCP SPT=53872 DPT=9882 SEQ=927108265 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE288070000000001030307)
Feb 23 04:09:55 localhost python3.9[124705]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:09:56 localhost python3.9[124778]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771837794.9916973-421-240879572633461/.source.fact _original_basename=.ytiar93n follow=False checksum=d686dccd4d8cd0883f3e3bc0a6f664c73290ba68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:09:57 localhost python3.9[124868]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:09:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34406 DF PROTO=TCP SPT=40142 DPT=9102 SEQ=420552745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE291460000000001030307)
Feb 23 04:09:58 localhost python3.9[124966]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:09:59 localhost python3.9[125020]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:10:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54588 DF PROTO=TCP SPT=44146 DPT=9100 SEQ=1842244641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE29FC60000000001030307)
Feb 23 04:10:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54589 DF PROTO=TCP SPT=44146 DPT=9100 SEQ=1842244641 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2A7C60000000001030307)
Feb 23 04:10:03 localhost systemd[1]: Reloading.
Feb 23 04:10:03 localhost systemd-rc-local-generator[125059]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:10:03 localhost systemd-sysv-generator[125063]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:10:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:10:03 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 23 04:10:04 localhost python3.9[125161]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:10:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58805 DF PROTO=TCP SPT=45086 DPT=9101 SEQ=608469731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2B4860000000001030307) Feb 23 04:10:06 localhost python3.9[125400]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Feb 23 04:10:07 localhost python3.9[125492]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Feb 23 04:10:08 localhost python3.9[125585]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:10:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 
PREC=0x00 TTL=62 ID=46372 DF PROTO=TCP SPT=52514 DPT=9882 SEQ=4025938738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2C0E20000000001030307) Feb 23 04:10:09 localhost python3.9[125677]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Feb 23 04:10:11 localhost python3.9[125769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:10:12 localhost python3.9[125861]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:10:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46374 DF PROTO=TCP SPT=52514 DPT=9882 SEQ=4025938738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2CD070000000001030307) Feb 23 04:10:12 localhost python3.9[125934]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771837811.5468314-745-13721008793683/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None 
attributes=None Feb 23 04:10:13 localhost python3.9[126026]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:10:15 localhost python3.9[126120]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Feb 23 04:10:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10160 DF PROTO=TCP SPT=52164 DPT=9105 SEQ=1394534134 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2D8060000000001030307) Feb 23 04:10:15 localhost python3.9[126213]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Feb 23 04:10:16 localhost python3.9[126306]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 23 04:10:17 localhost python3.9[126404]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Feb 23 04:10:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58807 DF PROTO=TCP SPT=45086 DPT=9101 SEQ=608469731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2E4060000000001030307) Feb 23 04:10:18 localhost python3.9[126496]: ansible-ansible.legacy.dnf Invoked with 
name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:10:22 localhost python3.9[126590]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:10:23 localhost sshd[126605]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:10:23 localhost sshd[126607]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:10:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6766 DF PROTO=TCP SPT=47330 DPT=9102 SEQ=4218432348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2FA600000000001030307) Feb 23 04:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46376 DF PROTO=TCP SPT=52514 DPT=9882 SEQ=4025938738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE2FE070000000001030307) Feb 23 04:10:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=6768 DF PROTO=TCP SPT=47330 DPT=9102 SEQ=4218432348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE306870000000001030307) Feb 23 04:10:27 localhost python3.9[126686]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:10:28 localhost python3.9[126759]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771837827.3949254-1018-196136338400691/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:10:29 localhost python3.9[126851]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:10:29 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 23 04:10:29 localhost systemd[1]: Stopped Load Kernel Modules. Feb 23 04:10:29 localhost systemd[1]: Stopping Load Kernel Modules... Feb 23 04:10:29 localhost systemd[1]: Starting Load Kernel Modules... Feb 23 04:10:29 localhost systemd-modules-load[126855]: Module 'msr' is built in Feb 23 04:10:29 localhost systemd[1]: Finished Load Kernel Modules. 
Feb 23 04:10:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27787 DF PROTO=TCP SPT=37474 DPT=9100 SEQ=3323489684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE314C70000000001030307) Feb 23 04:10:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27788 DF PROTO=TCP SPT=37474 DPT=9100 SEQ=3323489684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE31CC60000000001030307) Feb 23 04:10:33 localhost python3.9[126947]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:10:34 localhost python3.9[127020]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771837833.33873-1087-220998321439589/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:10:35 localhost python3.9[127112]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 
use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:10:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29067 DF PROTO=TCP SPT=41922 DPT=9101 SEQ=3475271719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE329C60000000001030307) Feb 23 04:10:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6770 DF PROTO=TCP SPT=47330 DPT=9102 SEQ=4218432348 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE336070000000001030307) Feb 23 04:10:39 localhost python3.9[127204]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:10:40 localhost python3.9[127296]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Feb 23 04:10:40 localhost python3.9[127386]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:10:42 localhost python3.9[127478]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:10:42 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Feb 23 04:10:42 localhost systemd[1]: tuned.service: Deactivated successfully. Feb 23 04:10:42 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Feb 23 04:10:42 localhost systemd[1]: tuned.service: Consumed 2.026s CPU time, no IO. 
Feb 23 04:10:42 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Feb 23 04:10:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62012 DF PROTO=TCP SPT=39948 DPT=9882 SEQ=889338140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE342060000000001030307) Feb 23 04:10:43 localhost systemd[1]: Started Dynamic System Tuning Daemon. Feb 23 04:10:44 localhost python3.9[127580]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Feb 23 04:10:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27790 DF PROTO=TCP SPT=37474 DPT=9100 SEQ=3323489684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE34C060000000001030307) Feb 23 04:10:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29069 DF PROTO=TCP SPT=41922 DPT=9101 SEQ=3475271719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE35A070000000001030307) Feb 23 04:10:51 localhost python3.9[127672]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:10:51 localhost systemd[1]: Reloading. Feb 23 04:10:52 localhost systemd-rc-local-generator[127701]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:10:52 localhost systemd-sysv-generator[127705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:10:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:10:52 localhost python3.9[127802]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:10:53 localhost systemd[1]: Reloading. Feb 23 04:10:54 localhost systemd-rc-local-generator[127877]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:10:54 localhost systemd-sysv-generator[127880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:10:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33995 DF PROTO=TCP SPT=44786 DPT=9102 SEQ=2922352443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE36F900000000001030307) Feb 23 04:10:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:10:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62014 DF PROTO=TCP SPT=39948 DPT=9882 SEQ=889338140 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE372070000000001030307) Feb 23 04:10:55 localhost python3.9[127991]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:10:55 localhost python3.9[128084]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:10:55 localhost kernel: Adding 1048572k swap on /swap. Priority:-2 extents:1 across:1048572k FS Feb 23 04:10:56 localhost python3.9[128177]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:10:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33997 DF PROTO=TCP SPT=44786 DPT=9102 SEQ=2922352443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE37B860000000001030307) Feb 23 04:10:58 localhost python3.9[128291]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:10:59 localhost python3.9[128384]: 
ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:10:59 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 23 04:10:59 localhost systemd[1]: Stopped Apply Kernel Variables. Feb 23 04:10:59 localhost systemd[1]: Stopping Apply Kernel Variables... Feb 23 04:10:59 localhost systemd[1]: Starting Apply Kernel Variables... Feb 23 04:10:59 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 23 04:10:59 localhost systemd[1]: Finished Apply Kernel Variables. Feb 23 04:11:00 localhost systemd[1]: session-38.scope: Deactivated successfully. Feb 23 04:11:00 localhost systemd[1]: session-38.scope: Consumed 2min 2.658s CPU time. Feb 23 04:11:00 localhost systemd-logind[759]: Session 38 logged out. Waiting for processes to exit. Feb 23 04:11:00 localhost systemd-logind[759]: Removed session 38. Feb 23 04:11:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45449 DF PROTO=TCP SPT=36540 DPT=9100 SEQ=287336381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE38A060000000001030307) Feb 23 04:11:02 localhost sshd[128404]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:11:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45450 DF PROTO=TCP SPT=36540 DPT=9100 SEQ=287336381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE392060000000001030307) Feb 23 04:11:04 localhost sshd[128406]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:11:05 localhost systemd-logind[759]: New session 39 of user zuul. Feb 23 04:11:05 localhost systemd[1]: Started Session 39 of User zuul. 
Feb 23 04:11:06 localhost python3.9[128499]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:11:06 localhost sshd[128504]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:11:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28357 DF PROTO=TCP SPT=59532 DPT=9101 SEQ=1427864577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE39F060000000001030307) Feb 23 04:11:07 localhost python3.9[128594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:11:08 localhost python3.9[128690]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:11:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20817 DF PROTO=TCP SPT=46794 DPT=9882 SEQ=2409356902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3AB430000000001030307) Feb 23 04:11:09 localhost python3.9[128781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:11:10 localhost python3.9[128877]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:11:11 localhost python3.9[128931]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False 
allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:11:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20819 DF PROTO=TCP SPT=46794 DPT=9882 SEQ=2409356902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3B7460000000001030307) Feb 23 04:11:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45452 DF PROTO=TCP SPT=36540 DPT=9100 SEQ=287336381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3C2060000000001030307) Feb 23 04:11:15 localhost python3.9[129026]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:11:17 localhost python3.9[129181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:11:17 localhost python3.9[129273]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:11:18 localhost python3.9[129377]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:11:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28359 DF PROTO=TCP SPT=59532 DPT=9101 SEQ=1427864577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3D0060000000001030307) Feb 23 04:11:18 localhost python3.9[129425]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:11:19 localhost python3.9[129517]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:11:20 localhost python3.9[129590]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771837879.155516-318-267515019269428/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:11:21 localhost python3.9[129682]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 23 04:11:21 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Feb 23 04:11:21 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 04:11:21 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:11:21 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:11:21 localhost python3.9[129775]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 23 04:11:22 localhost python3.9[129867]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 23 04:11:23 localhost python3.9[129959]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Feb 23 04:11:23 localhost python3.9[130049]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:11:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20149 DF PROTO=TCP SPT=36620 DPT=9102 SEQ=1249038669 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3E4C10000000001030307) Feb 23 04:11:24 localhost python3.9[130143]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20821 DF PROTO=TCP SPT=46794 DPT=9882 SEQ=2409356902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3E8070000000001030307) Feb 23 04:11:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20151 DF PROTO=TCP SPT=36620 DPT=9102 SEQ=1249038669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3F0C60000000001030307) Feb 23 04:11:28 localhost python3.9[130237]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False 
update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2677 DF PROTO=TCP SPT=59320 DPT=9100 SEQ=600894686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE3FF460000000001030307) Feb 23 04:11:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2678 DF PROTO=TCP SPT=59320 DPT=9100 SEQ=600894686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE407460000000001030307) Feb 23 04:11:33 localhost python3.9[130331]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32817 DF PROTO=TCP SPT=45238 DPT=9101 SEQ=2992531929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE414060000000001030307) Feb 23 04:11:37 localhost python3.9[130431]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False 
allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20153 DF PROTO=TCP SPT=36620 DPT=9102 SEQ=1249038669 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE420070000000001030307) Feb 23 04:11:41 localhost python3.9[130525]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:42 localhost sshd[130528]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:11:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38399 DF PROTO=TCP SPT=58728 DPT=9882 SEQ=903369776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE42C870000000001030307) Feb 23 04:11:43 localhost sshd[130530]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:11:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27552 DF PROTO=TCP SPT=41184 DPT=9105 SEQ=3035751851 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE438060000000001030307) Feb 23 04:11:46 localhost python3.9[130623]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32819 DF PROTO=TCP SPT=45238 DPT=9101 SEQ=2992531929 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE444060000000001030307) Feb 23 04:11:50 localhost python3.9[130717]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:11:54 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6599 DF PROTO=TCP SPT=52348 DPT=9102 SEQ=3188560332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE459F00000000001030307) Feb 23 04:11:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38401 DF PROTO=TCP SPT=58728 DPT=9882 SEQ=903369776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE45C060000000001030307) Feb 23 04:11:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6601 DF PROTO=TCP SPT=52348 DPT=9102 SEQ=3188560332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE466070000000001030307) Feb 23 04:12:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52695 DF PROTO=TCP SPT=54462 DPT=9100 SEQ=1675174803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE474870000000001030307) Feb 23 04:12:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52696 DF PROTO=TCP SPT=54462 DPT=9100 SEQ=1675174803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE47C860000000001030307) Feb 23 04:12:03 localhost python3.9[131014]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False 
skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:12:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30765 DF PROTO=TCP SPT=37558 DPT=9101 SEQ=67614132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE489460000000001030307) Feb 23 04:12:07 localhost python3.9[131109]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:12:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32703 DF PROTO=TCP SPT=55290 DPT=9882 SEQ=3556716332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE495A30000000001030307) Feb 23 04:12:11 localhost python3.9[131207]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:12:12 localhost 
python3.9[131312]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:12:12 localhost auditd[725]: Audit daemon rotating log files Feb 23 04:12:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32705 DF PROTO=TCP SPT=55290 DPT=9882 SEQ=3556716332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4A1C60000000001030307) Feb 23 04:12:12 localhost python3.9[131385]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771837931.9345686-771-41388446962518/.source.json _original_basename=.dlqjd6bq follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:12:13 localhost python3.9[131477]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None 
quadlet_file_mode=None quadlet_options=None Feb 23 04:12:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52698 DF PROTO=TCP SPT=54462 DPT=9100 SEQ=1675174803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4AC060000000001030307) Feb 23 04:12:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30767 DF PROTO=TCP SPT=37558 DPT=9101 SEQ=67614132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4BA070000000001030307) Feb 23 04:12:20 localhost podman[131491]: 2026-02-23 09:12:14.063904368 +0000 UTC m=+0.042549986 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 23 04:12:21 localhost python3.9[131692]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 23 04:12:21 localhost sshd[131719]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:12:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=59992 DF PROTO=TCP SPT=57914 DPT=9102 SEQ=780876611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4CF1F0000000001030307) Feb 23 04:12:24 localhost sshd[131734]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:12:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32707 DF PROTO=TCP SPT=55290 DPT=9882 SEQ=3556716332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4D2060000000001030307) Feb 23 04:12:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59994 DF PROTO=TCP SPT=57914 DPT=9102 SEQ=780876611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4DB460000000001030307) Feb 23 04:12:29 localhost podman[131706]: 2026-02-23 09:12:21.579995949 +0000 UTC m=+0.042864176 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 23 04:12:30 localhost python3.9[131914]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 23 04:12:30 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12593 DF PROTO=TCP SPT=58046 DPT=9100 SEQ=385021854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4E9860000000001030307) Feb 23 04:12:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12594 DF PROTO=TCP SPT=58046 DPT=9100 SEQ=385021854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4F1860000000001030307) Feb 23 04:12:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=499 DF PROTO=TCP SPT=57266 DPT=9101 SEQ=434442640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE4FE870000000001030307) Feb 23 04:12:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63204 DF PROTO=TCP SPT=40094 DPT=9882 SEQ=3854489123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE50AD30000000001030307) Feb 23 04:12:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63206 DF PROTO=TCP SPT=40094 DPT=9882 SEQ=3854489123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE516C60000000001030307) Feb 23 04:12:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14313 DF PROTO=TCP SPT=47896 DPT=9105 SEQ=588704374 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE522060000000001030307) Feb 23 04:12:47 localhost 
podman[131928]: 2026-02-23 09:12:30.822706318 +0000 UTC m=+0.048114221 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 23 04:12:48 localhost python3.9[132611]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 23 04:12:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=501 DF PROTO=TCP SPT=57266 DPT=9101 SEQ=434442640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE52E060000000001030307) Feb 23 04:12:49 localhost podman[132625]: 2026-02-23 09:12:48.414726034 +0000 UTC m=+0.031213131 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Feb 23 04:12:51 localhost python3.9[132785]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': 
None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 23 04:12:52 localhost podman[132797]: 2026-02-23 09:12:51.257906628 +0000 UTC m=+0.044995313 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:12:53 localhost python3.9[132964]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 23 04:12:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9095 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=4131939506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE544500000000001030307) Feb 23 04:12:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63208 DF PROTO=TCP SPT=40094 DPT=9882 SEQ=3854489123 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE546070000000001030307) Feb 23 04:12:57 localhost podman[132977]: 2026-02-23 09:12:53.540990302 +0000 UTC m=+0.051730295 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Feb 23 04:12:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9097 DF PROTO=TCP SPT=51194 DPT=9102 SEQ=4131939506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE550460000000001030307) Feb 23 04:12:58 localhost python3.9[133155]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None Feb 23 04:12:59 localhost podman[133167]: 2026-02-23 09:12:58.377557657 +0000 UTC m=+0.052666465 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c Feb 23 04:13:00 localhost systemd-logind[759]: Session 39 logged out. Waiting for processes to exit. Feb 23 04:13:00 localhost systemd[1]: session-39.scope: Deactivated successfully. Feb 23 04:13:00 localhost systemd[1]: session-39.scope: Consumed 2min 7.471s CPU time. 
Feb 23 04:13:00 localhost systemd-logind[759]: Removed session 39. Feb 23 04:13:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37093 DF PROTO=TCP SPT=50740 DPT=9100 SEQ=1221982567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE55EC60000000001030307) Feb 23 04:13:01 localhost sshd[133352]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:13:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37094 DF PROTO=TCP SPT=50740 DPT=9100 SEQ=1221982567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE566C70000000001030307) Feb 23 04:13:04 localhost sshd[133354]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:13:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42667 DF PROTO=TCP SPT=53756 DPT=9101 SEQ=1338459337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE573C60000000001030307) Feb 23 04:13:06 localhost sshd[133356]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:13:06 localhost systemd-logind[759]: New session 40 of user zuul. Feb 23 04:13:06 localhost systemd[1]: Started Session 40 of User zuul. 
Feb 23 04:13:07 localhost python3.9[133449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:13:09 localhost python3.9[133546]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None Feb 23 04:13:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12366 DF PROTO=TCP SPT=58490 DPT=9882 SEQ=1431760514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE580050000000001030307) Feb 23 04:13:10 localhost python3.9[133639]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:13:11 localhost python3.9[133693]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:13:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12368 DF PROTO=TCP SPT=58490 DPT=9882 SEQ=1431760514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE58C060000000001030307) Feb 23 04:13:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=55167 DF PROTO=TCP SPT=54562 DPT=9105 SEQ=2723027043 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE596060000000001030307) Feb 23 04:13:16 localhost python3.9[133787]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:13:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42669 DF PROTO=TCP SPT=53756 DPT=9101 SEQ=1338459337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5A4060000000001030307) Feb 23 04:13:22 localhost python3.9[133975]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 23 04:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7876 DF PROTO=TCP SPT=47194 DPT=9102 SEQ=4261620880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5B97F0000000001030307) Feb 23 04:13:24 localhost python3.9[134068]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12370 DF PROTO=TCP SPT=58490 DPT=9882 SEQ=1431760514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5BC070000000001030307)
Feb 23 04:13:25 localhost python3.9[134160]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 23 04:13:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7878 DF PROTO=TCP SPT=47194 DPT=9102 SEQ=4261620880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5C5870000000001030307)
Feb 23 04:13:27 localhost kernel: SELinux: Converting 2756 SID table entries...
Feb 23 04:13:27 localhost kernel: SELinux: policy capability network_peer_controls=1
Feb 23 04:13:27 localhost kernel: SELinux: policy capability open_perms=1
Feb 23 04:13:27 localhost kernel: SELinux: policy capability extended_socket_class=1
Feb 23 04:13:27 localhost kernel: SELinux: policy capability always_check_network=0
Feb 23 04:13:27 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Feb 23 04:13:27 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 23 04:13:27 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Feb 23 04:13:28 localhost python3.9[134466]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:13:29 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=18 res=1
Feb 23 04:13:29 localhost python3.9[134564]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:13:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54570 DF PROTO=TCP SPT=36690 DPT=9100 SEQ=2296867033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5D4060000000001030307)
Feb 23 04:13:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54571 DF PROTO=TCP SPT=36690 DPT=9100 SEQ=2296867033 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5DC070000000001030307)
Feb 23 04:13:33 localhost python3.9[134658]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:13:35 localhost python3.9[134903]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 23 04:13:36 localhost python3.9[134993]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:13:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11740 DF PROTO=TCP SPT=35688 DPT=9101 SEQ=736404783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5E8C60000000001030307)
Feb 23 04:13:36 localhost python3.9[135087]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:13:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41875 DF PROTO=TCP SPT=45246 DPT=9882 SEQ=1921502582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE5F5330000000001030307)
Feb 23 04:13:41 localhost python3.9[135181]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:13:41 localhost sshd[135184]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:13:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41877 DF PROTO=TCP SPT=45246 DPT=9882 SEQ=1921502582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE601470000000001030307)
Feb 23 04:13:44 localhost sshd[135200]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:13:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63096 DF PROTO=TCP SPT=49386 DPT=9105 SEQ=841001769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE60C060000000001030307)
Feb 23 04:13:45 localhost python3.9[135279]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 23 04:13:46 localhost systemd[1]: Reloading.
Feb 23 04:13:46 localhost systemd-rc-local-generator[135305]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:13:46 localhost systemd-sysv-generator[135308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:13:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:13:47 localhost python3.9[135411]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:13:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11742 DF PROTO=TCP SPT=35688 DPT=9101 SEQ=736404783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE618060000000001030307)
Feb 23 04:13:48 localhost python3.9[135503]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:49 localhost python3.9[135597]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:49 localhost python3.9[135689]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:50 localhost python3.9[135781]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:13:51 localhost python3.9[135854]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838030.0737226-564-200397121333463/.source _original_basename=.rffe3863 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:52 localhost python3.9[135946]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:53 localhost python3.9[136038]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 23 04:13:53 localhost python3.9[136130]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36092 DF PROTO=TCP SPT=37026 DPT=9102 SEQ=2924876512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE62EB00000000001030307)
Feb 23 04:13:54 localhost python3.9[136222]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41879 DF PROTO=TCP SPT=45246 DPT=9882 SEQ=1921502582 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE632060000000001030307)
Feb 23 04:13:55 localhost python3.9[136295]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838034.1024196-690-273551418196324/.source.yaml _original_basename=.iim40c_8 follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:13:55 localhost python3.9[136387]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 23 04:13:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36094 DF PROTO=TCP SPT=37026 DPT=9102 SEQ=2924876512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE63AC60000000001030307)
Feb 23 04:13:57 localhost ansible-async_wrapper.py[136492]: Invoked with j242000459938 300 /home/zuul/.ansible/tmp/ansible-tmp-1771838036.3497324-762-23743892687750/AnsiballZ_edpm_os_net_config.py _
Feb 23 04:13:57 localhost ansible-async_wrapper.py[136495]: Starting module and watcher
Feb 23 04:13:57 localhost ansible-async_wrapper.py[136495]: Start watching 136496 (300)
Feb 23 04:13:57 localhost ansible-async_wrapper.py[136496]: Start module (136496)
Feb 23 04:13:57 localhost ansible-async_wrapper.py[136492]: Return async_wrapper task started.
Feb 23 04:13:57 localhost python3.9[136497]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=False purge_provider=
Feb 23 04:13:58 localhost ansible-async_wrapper.py[136496]: Module complete (136496)
Feb 23 04:14:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9324 DF PROTO=TCP SPT=40138 DPT=9100 SEQ=2835335867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE649460000000001030307)
Feb 23 04:14:02 localhost ansible-async_wrapper.py[136495]: Done in kid B.
Feb 23 04:14:02 localhost podman[136658]: 2026-02-23 09:14:02.232297075 +0000 UTC m=+0.083676888 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 04:14:02 localhost podman[136658]: 2026-02-23 09:14:02.328560466 +0000 UTC m=+0.179940289 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.42.2, architecture=x86_64, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 23 04:14:02 localhost python3.9[136762]: ansible-ansible.legacy.async_status Invoked with jid=j242000459938.136492 mode=status _async_dir=/root/.ansible_async
Feb 23 04:14:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9325 DF PROTO=TCP SPT=40138 DPT=9100 SEQ=2835335867 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE651460000000001030307)
Feb 23 04:14:03 localhost python3.9[136857]: ansible-ansible.legacy.async_status Invoked with jid=j242000459938.136492 mode=cleanup _async_dir=/root/.ansible_async
Feb 23 04:14:03 localhost python3.9[136980]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:14:04 localhost python3.9[137068]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838043.4010186-828-258624170803688/.source.returncode _original_basename=.8p_ff5od follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:14:05 localhost python3.9[137160]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:14:05 localhost python3.9[137233]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838044.6354992-876-64034903805338/.source.cfg _original_basename=.xgh9r2sy follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:14:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19951 DF PROTO=TCP SPT=41204 DPT=9101 SEQ=1234346066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE65E070000000001030307)
Feb 23 04:14:06 localhost python3.9[137325]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 04:14:06 localhost systemd[1]: Reloading Network Manager...
Feb 23 04:14:06 localhost NetworkManager[5974]: [1771838046.5329] audit: op="reload" arg="0" pid=137329 uid=0 result="success"
Feb 23 04:14:06 localhost NetworkManager[5974]: [1771838046.5337] config: signal: SIGHUP (no changes from disk)
Feb 23 04:14:06 localhost systemd[1]: Reloaded Network Manager.
Feb 23 04:14:07 localhost systemd[1]: session-40.scope: Deactivated successfully.
Feb 23 04:14:07 localhost systemd[1]: session-40.scope: Consumed 38.299s CPU time.
Feb 23 04:14:07 localhost systemd-logind[759]: Session 40 logged out. Waiting for processes to exit.
Feb 23 04:14:07 localhost systemd-logind[759]: Removed session 40.
Feb 23 04:14:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36096 DF PROTO=TCP SPT=37026 DPT=9102 SEQ=2924876512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE66A060000000001030307)
Feb 23 04:14:11 localhost sshd[137344]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:14:12 localhost systemd-logind[759]: New session 41 of user zuul.
Feb 23 04:14:12 localhost systemd[1]: Started Session 41 of User zuul.
Feb 23 04:14:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1061 DF PROTO=TCP SPT=49698 DPT=9882 SEQ=1864574002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE676860000000001030307)
Feb 23 04:14:13 localhost python3.9[137437]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:14:14 localhost python3.9[137531]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:14:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33838 DF PROTO=TCP SPT=38584 DPT=9105 SEQ=3783356907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE680060000000001030307)
Feb 23 04:14:15 localhost python3.9[137684]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:14:16 localhost systemd[1]: session-41.scope: Deactivated successfully.
Feb 23 04:14:16 localhost systemd[1]: session-41.scope: Consumed 2.264s CPU time.
Feb 23 04:14:16 localhost systemd-logind[759]: Session 41 logged out. Waiting for processes to exit.
Feb 23 04:14:16 localhost systemd-logind[759]: Removed session 41.
Feb 23 04:14:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19953 DF PROTO=TCP SPT=41204 DPT=9101 SEQ=1234346066 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE68E070000000001030307)
Feb 23 04:14:20 localhost sshd[137700]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:14:21 localhost sshd[137702]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:14:21 localhost systemd-logind[759]: New session 42 of user zuul.
Feb 23 04:14:21 localhost systemd[1]: Started Session 42 of User zuul.
Feb 23 04:14:22 localhost python3.9[137795]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:14:23 localhost python3.9[137889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:14:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41070 DF PROTO=TCP SPT=55522 DPT=9102 SEQ=3278640626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6A3E00000000001030307)
Feb 23 04:14:24 localhost python3.9[137985]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:14:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1063 DF PROTO=TCP SPT=49698 DPT=9882 SEQ=1864574002 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6A6060000000001030307)
Feb 23 04:14:24 localhost sshd[137994]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:14:25 localhost python3.9[138041]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:14:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41072 DF PROTO=TCP SPT=55522 DPT=9102 SEQ=3278640626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6B0070000000001030307)
Feb 23 04:14:29 localhost python3.9[138135]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:14:30 localhost python3.9[138290]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:14:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13771 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=4080666825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6BE460000000001030307)
Feb 23 04:14:31 localhost python3.9[138382]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:14:32 localhost python3.9[138486]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:14:32 localhost python3.9[138534]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:14:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13772 DF PROTO=TCP SPT=43006 DPT=9100 SEQ=4080666825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6C6460000000001030307)
Feb 23 04:14:33 localhost python3.9[138626]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:14:33 localhost python3.9[138674]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:14:34 localhost python3.9[138766]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:14:35 localhost python3.9[138858]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:14:35 localhost python3.9[138950]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:14:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39239 DF PROTO=TCP SPT=37120 DPT=9101 SEQ=249785914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6D3460000000001030307)
Feb 23 04:14:36 localhost python3.9[139042]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:14:37 localhost python3.9[139134]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:14:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64432 DF PROTO=TCP SPT=35190 DPT=9882 SEQ=2216662187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6DF930000000001030307)
Feb 23 04:14:41 localhost python3.9[139228]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:14:42 localhost python3.9[139322]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:14:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64434 DF PROTO=TCP SPT=35190 DPT=9882 SEQ=2216662187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6EB870000000001030307)
Feb 23 04:14:43 localhost python3.9[139414]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:14:43 localhost python3.9[139506]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:14:44 localhost python3.9[139599]: ansible-service_facts Invoked
Feb 23 04:14:44 localhost network[139616]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 04:14:44 localhost network[139617]: 'network-scripts' will be removed from distribution in near future.
Feb 23 04:14:44 localhost network[139618]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 04:14:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58914 DF PROTO=TCP SPT=58300 DPT=9105 SEQ=3528398000 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE6F6060000000001030307)
Feb 23 04:14:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:14:46 localhost sshd[139697]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:14:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39241 DF PROTO=TCP SPT=37120 DPT=9101 SEQ=249785914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE704070000000001030307)
Feb 23 04:14:50 localhost python3.9[139941]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:14:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43892 DF PROTO=TCP SPT=40478 DPT=9102 SEQ=428619448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7190F0000000001030307)
Feb 23 04:14:55 localhost python3.9[140035]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 23 04:14:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64436 DF PROTO=TCP SPT=35190 DPT=9882 SEQ=2216662187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE71CF90000000001030307)
Feb 23 04:14:56 localhost python3.9[140127]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:14:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43894 DF PROTO=TCP SPT=40478 DPT=9102 SEQ=428619448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE725070000000001030307)
Feb 23 04:14:57 localhost python3.9[140202]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838096.005159-651-221120623885301/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:14:58 localhost python3.9[140296]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:14:58 localhost python3.9[140371]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838097.5552204-696-107666057157854/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:15:00 localhost python3.9[140465]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:15:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19694 DF PROTO=TCP SPT=38358 DPT=9100 SEQ=2257578880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE733860000000001030307)
Feb 23 04:15:01 localhost sshd[140517]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:15:01 localhost python3.9[140560]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:15:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19695 DF PROTO=TCP SPT=38358 DPT=9100 SEQ=2257578880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE73B870000000001030307)
Feb 23 04:15:03 localhost python3.9[140615]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 04:15:04 localhost sshd[140673]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:15:05 localhost python3.9[140781]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:15:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25763 DF PROTO=TCP SPT=33488 DPT=9101 SEQ=318659141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE748860000000001030307)
Feb 23 04:15:06 localhost python3.9[140842]:
ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:15:06 localhost chronyd[25974]: chronyd exiting Feb 23 04:15:06 localhost systemd[1]: Stopping NTP client/server... Feb 23 04:15:06 localhost systemd[1]: chronyd.service: Deactivated successfully. Feb 23 04:15:06 localhost systemd[1]: Stopped NTP client/server. Feb 23 04:15:06 localhost systemd[1]: Starting NTP client/server... Feb 23 04:15:06 localhost chronyd[140850]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Feb 23 04:15:06 localhost chronyd[140850]: Frequency -30.764 +/- 0.375 ppm read from /var/lib/chrony/drift Feb 23 04:15:06 localhost chronyd[140850]: Loaded seccomp filter (level 2) Feb 23 04:15:06 localhost systemd[1]: Started NTP client/server. Feb 23 04:15:07 localhost systemd[1]: session-42.scope: Deactivated successfully. Feb 23 04:15:07 localhost systemd[1]: session-42.scope: Consumed 29.434s CPU time. Feb 23 04:15:07 localhost systemd-logind[759]: Session 42 logged out. Waiting for processes to exit. Feb 23 04:15:07 localhost systemd-logind[759]: Removed session 42. 
Feb 23 04:15:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47709 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE754C30000000001030307) Feb 23 04:15:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47711 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE760C60000000001030307) Feb 23 04:15:12 localhost sshd[140866]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:15:12 localhost systemd-logind[759]: New session 43 of user zuul. Feb 23 04:15:12 localhost systemd[1]: Started Session 43 of User zuul. Feb 23 04:15:13 localhost python3.9[140959]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:15:15 localhost python3.9[141055]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13448 DF PROTO=TCP SPT=57840 DPT=9105 SEQ=1310932017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE76C060000000001030307) Feb 23 04:15:15 localhost python3.9[141160]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json 
follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:16 localhost python3.9[141208]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.bwt_xnhl recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:17 localhost python3.9[141300]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:18 localhost python3.9[141375]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838116.8482292-138-210665163821427/.source _original_basename=.n39d2bru follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25765 DF PROTO=TCP SPT=33488 DPT=9101 SEQ=318659141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE778060000000001030307) Feb 23 04:15:18 localhost python3.9[141467]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:15:19 localhost python3.9[141559]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:19 localhost python3.9[141632]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838118.8802204-210-248966440261134/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:15:20 localhost python3.9[141724]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:21 localhost python3.9[141797]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838120.0959597-210-252500718927505/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:15:21 localhost python3.9[141889]: ansible-ansible.builtin.file 
Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:22 localhost python3.9[141981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:23 localhost python3.9[142054]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838121.9389346-321-262015738322171/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:23 localhost python3.9[142146]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15057 DF PROTO=TCP SPT=44814 DPT=9102 SEQ=461662555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE78E400000000001030307) Feb 23 04:15:24 localhost python3.9[142219]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838123.1881382-366-34374967328336/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47713 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE790070000000001030307) Feb 23 04:15:25 localhost python3.9[142311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:15:25 localhost systemd[1]: Reloading. Feb 23 04:15:25 localhost systemd-sysv-generator[142334]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:15:25 localhost systemd-rc-local-generator[142328]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:15:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:15:25 localhost systemd[1]: Reloading. Feb 23 04:15:25 localhost systemd-rc-local-generator[142376]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:15:25 localhost systemd-sysv-generator[142379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:15:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:15:26 localhost systemd[1]: Starting EDPM Container Shutdown... Feb 23 04:15:26 localhost systemd[1]: Finished EDPM Container Shutdown. Feb 23 04:15:26 localhost python3.9[142479]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15059 DF PROTO=TCP SPT=44814 DPT=9102 SEQ=461662555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE79A460000000001030307) Feb 23 04:15:27 localhost python3.9[142552]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838126.2247381-435-227566872375162/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:27 localhost python3.9[142644]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True 
get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:28 localhost python3.9[142717]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838127.4125059-480-110735526367212/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:29 localhost python3.9[142809]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:15:29 localhost systemd[1]: Reloading. Feb 23 04:15:29 localhost systemd-rc-local-generator[142829]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:15:29 localhost systemd-sysv-generator[142834]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:15:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:15:29 localhost systemd[1]: Starting Create netns directory... Feb 23 04:15:29 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 23 04:15:29 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 04:15:29 localhost systemd[1]: Finished Create netns directory. 
Feb 23 04:15:30 localhost python3.9[142940]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:15:30 localhost network[142957]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:15:30 localhost network[142958]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:15:30 localhost network[142959]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 04:15:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36146 DF PROTO=TCP SPT=52040 DPT=9100 SEQ=1005657160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7A8C70000000001030307) Feb 23 04:15:32 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:15:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36147 DF PROTO=TCP SPT=52040 DPT=9100 SEQ=1005657160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7B0C60000000001030307) Feb 23 04:15:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1403 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=901113687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7BD860000000001030307) Feb 23 04:15:36 localhost python3.9[143160]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:37 localhost python3.9[143235]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838136.183265-603-107077308182534/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:37 localhost sshd[143259]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:15:38 localhost python3.9[143329]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:15:38 localhost systemd[1]: Reloading OpenSSH server daemon... Feb 23 04:15:38 localhost systemd[1]: Reloaded OpenSSH server daemon. Feb 23 04:15:38 localhost sshd[122114]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:15:39 localhost python3.9[143426]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34233 DF PROTO=TCP SPT=37460 DPT=9882 SEQ=1628761315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7C9F30000000001030307) Feb 23 04:15:39 localhost python3.9[143518]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 
get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:40 localhost python3.9[143591]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838139.3459609-696-193298550226613/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:41 localhost python3.9[143683]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Feb 23 04:15:41 localhost systemd[1]: Starting Time & Date Service... Feb 23 04:15:41 localhost systemd[1]: Started Time & Date Service. Feb 23 04:15:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34235 DF PROTO=TCP SPT=37460 DPT=9882 SEQ=1628761315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7D6060000000001030307) Feb 23 04:15:42 localhost python3.9[143779]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:43 localhost python3.9[143871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:43 localhost sshd[143942]: main: sshd: 
ssh-rsa algorithm is disabled Feb 23 04:15:43 localhost python3.9[143945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838142.8002887-801-231261638333350/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:44 localhost python3.9[144038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8906 DF PROTO=TCP SPT=58462 DPT=9105 SEQ=299049209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7E0060000000001030307) Feb 23 04:15:45 localhost python3.9[144111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838143.9978013-846-148658485450048/.source.yaml _original_basename=.1a7jpgsw follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:45 localhost python3.9[144203]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True 
get_attributes=True get_selinux_context=False Feb 23 04:15:46 localhost python3.9[144278]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838145.2218115-891-96936750585835/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:47 localhost python3.9[144370]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:15:47 localhost python3.9[144463]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:15:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1405 DF PROTO=TCP SPT=49826 DPT=9101 SEQ=901113687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE7EE060000000001030307) Feb 23 04:15:48 localhost python3[144556]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 23 04:15:49 localhost python3.9[144648]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:50 localhost python3.9[144721]: ansible-ansible.legacy.copy Invoked with 
dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838148.9605737-1008-266884096942049/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:50 localhost python3.9[144813]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:51 localhost python3.9[144886]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838150.2007642-1053-79509395947350/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:51 localhost python3.9[144978]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:52 localhost python3.9[145051]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838151.4395404-1098-24111126058740/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:53 localhost python3.9[145143]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:53 localhost python3.9[145216]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838152.6633744-1143-228957053568640/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11804 DF PROTO=TCP SPT=42786 DPT=9102 SEQ=2336433841 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE803700000000001030307) Feb 23 04:15:54 localhost python3.9[145308]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34237 DF PROTO=TCP SPT=37460 DPT=9882 SEQ=1628761315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE806070000000001030307) Feb 23 04:15:54 localhost python3.9[145381]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771838153.8990192-1188-61025452854529/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:55 localhost python3.9[145473]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:56 localhost python3.9[145565]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:15:57 localhost python3.9[145660]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:58 localhost 
python3.9[145753]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43898 DF PROTO=TCP SPT=40478 DPT=9102 SEQ=428619448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE814070000000001030307) Feb 23 04:15:58 localhost python3.9[145845]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:15:59 localhost python3.9[145937]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Feb 23 04:16:00 localhost python3.9[146030]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Feb 23 04:16:00 localhost systemd[1]: session-43.scope: Deactivated successfully. Feb 23 04:16:00 localhost systemd-logind[759]: Session 43 logged out. Waiting for processes to exit. Feb 23 04:16:00 localhost systemd[1]: session-43.scope: Consumed 30.023s CPU time. Feb 23 04:16:00 localhost systemd-logind[759]: Removed session 43. 
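The two `ansible.posix.mount` calls above mount hugetlbfs with distinct page sizes (1G and 2M) on the directories created just before them. A minimal sketch of the equivalent manual steps, using the paths, owner, and mode from the log; it only renders the fstab lines and shows the privileged commands as comments, so it is safe to run unprivileged:

```shell
#!/bin/sh
# Sketch of the hugepage setup recorded above (ansible.posix.mount with
# fstype=hugetlbfs, state=mounted, boot=True). This renders the fstab
# entries to a scratch file and documents the operator commands; it
# performs no privileged action itself.
set -eu
out=/tmp/hugepages-fstab.txt
{
  echo 'none /dev/hugepages1G hugetlbfs pagesize=1G 0 0'
  echo 'none /dev/hugepages2M hugetlbfs pagesize=2M 0 0'
} > "$out"

# Equivalent manual steps (owner/group/mode taken from the log):
#   mkdir -p /dev/hugepages1G
#   chown zuul:hugetlbfs /dev/hugepages1G && chmod 0775 /dev/hugepages1G
#   mount -t hugetlbfs -o pagesize=1G none /dev/hugepages1G
#   (and likewise with pagesize=2M for /dev/hugepages2M)
echo "fstab entries written to $out"
```

`boot=True` in the module call is what persists these mounts into /etc/fstab on the real host.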
Feb 23 04:16:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36150 DF PROTO=TCP SPT=52040 DPT=9100 SEQ=1005657160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE820070000000001030307) Feb 23 04:16:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45855 DF PROTO=TCP SPT=45286 DPT=9101 SEQ=1933992049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE826CC0000000001030307) Feb 23 04:16:06 localhost sshd[146108]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:06 localhost systemd-logind[759]: New session 44 of user zuul. Feb 23 04:16:06 localhost systemd[1]: Started Session 44 of User zuul. Feb 23 04:16:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25767 DF PROTO=TCP SPT=33488 DPT=9101 SEQ=318659141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE836060000000001030307) Feb 23 04:16:07 localhost python3.9[146218]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. 
suffix= path=None Feb 23 04:16:09 localhost python3.9[146310]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:16:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29843 DF PROTO=TCP SPT=43632 DPT=9882 SEQ=163079574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE83F230000000001030307) Feb 23 04:16:10 localhost python3.9[146404]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Feb 23 04:16:11 localhost python3.9[146496]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.w2vcy2ch follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:16:11 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. 
Feb 23 04:16:12 localhost python3.9[146573]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.w2vcy2ch mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838171.1422215-189-233192297255079/.source.w2vcy2ch _original_basename=.iiut_32_ follow=False checksum=d1d6d40786432d7ee1aec581e269930dfc2795e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:13 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47715 DF PROTO=TCP SPT=49170 DPT=9882 SEQ=3272426637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE84E070000000001030307) Feb 23 04:16:14 localhost python3.9[146665]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:16:15 localhost sshd[146721]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:15 localhost python3.9[146759]: ansible-ansible.builtin.blockinfile Invoked with block=np0005626466.localdomain,192.168.122.108,np0005626466* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD4dg5LfbOyIHJudQjfDyIcqYXRqMUeYQIpjQPmNS0Tl7/EpBaYixjqlNovKIWOwkS4E2n4hwPLSTGSihYb5BeUDw32T80RumycS2tjBCSLiuq93xpTOaL2X+7wykkOSfY5xya13qrTg0ROJip0B6PSSF+Rn28SAKLh91euCdRaxWTAMeOSTP9WeCA3d0gsgb4xSMMWZxR4o1BU2bixjAcJHAlKYDc1OGpKkirRoziu9Y4nq2lmbwTg5HiS8STVkqyGHba9k6IC0eF2ZmT6M2thoHatYVtjuUeEE9bSvaAFB8oSI9Np6+OaluvuoKJYjRA3dzEQOi4ft/wwUrJfvyypDAxKBkxo7lCWIDEBK5Zb9BVoo68psz2IVPNGNZJtKXiq58CAqZTR02l/wEq4wB1/hp7ZW+ZMnHQUq1FpGITIA89KZeL9xNlnHqYak58B2GCYgK6OdvWktr4WHN8nbEmwZvaTrijZvnww7h2FQG4BMcSlO6AWKAdjksJZlVDYLJs=#012np0005626466.localdomain,192.168.122.108,np0005626466* 
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIiaRdmYDJrMg8atO+fnuqzJdDL1JaVGt341/g0QTv04#012np0005626466.localdomain,192.168.122.108,np0005626466* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGMyxprJk2KMNU4/eWUo8EdX2W79HO4pGHl3Ze8LEhDdSbCzY8uy6KD6met+RL0bD767zsXbqEV/9peHg1x5qjM=#012np0005626459.localdomain,192.168.122.103,np0005626459* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9VsrIfV6Z4AiMtHfmjOpcBCt5sMsGmP0fOSak1UBP4r9lW4eYyoJY7Rtt1LDAcbGqdL3Nh3yc8ub0ekpXF6MA0vKucLb+jtjexv6t21W2grJ+ucwsvDhTDhDXmOUwD5G7A9Zj2WDqt/DN4DxeEqvQ6v1dSQaG+17BVPvM7mhgd5CSYOdUphCC81TPZgj3xyK31Q89biIS6pCBSKnsyN7qcU38bFGvRN0sTFaFt9KrIUfJJdcAZudw5Q/R775pmaaeHTSVPL05gE7dyz8RicEpenh6X0aZCOVt0+4VBnfXXSIL9QIwjrarPPKRdtmQY7dZ3dVNI1ZWA5YOl0y6R3fmxaRV5y1ZkDW6vG0463hYjKaAVqILAAPZGzhuzL7/1zxIv0guUB58tOUrCkkPIRzd6NQLL2j8L7RLIj3bZjG2xf0WiierxPsCEhl3wmdIVRUReE6jYalNGlscGUr1JWproKoaQqfck0OWhGy7jCCe8Gd8a/pr7jtg+X3bEMQ3HAc=#012np0005626459.localdomain,192.168.122.103,np0005626459* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIB1G62+/VP1cWp/d17CbWxlG5w4IEqmUSSc9SyShSsKo#012np0005626459.localdomain,192.168.122.103,np0005626459* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBF1G+CYZWMPROBz875F8bjcexPOeozjteUw/Fu+xHwwpYK4DPmCNq+JbW1AmCaltVkHRnMMPqLBom+3c+ekTh4E=#012np0005626460.localdomain,192.168.122.104,np0005626460* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCeQmwl5IUCA7h6xphf+o3WARi0Xlj+0K08ltN/FCX7iF0EALCfDqtKOHz7wv5gS04Zx4aeNfcVHv9bHLRJxTPzliSNVutqA7vdFa0R/kRMdNzkqSOCuJ64sQ8GwSOHSrcFy7qC87BuP6xB9atSBjpAEB4NZOuXbvmSN/dCa/nNpUWoWNNg3eR5AalrExCptFYZ4E7YWvJ6HdZpr1QhcAJW0V1y4+u4FfzxHT2SQfGmua4TFHH1lUMiMrgAoELLe+pYdnWooEhRlkPulWy/wOyNz7aCCDP462XBhCc0CmiBDRwMBaJISck1pJCOIksvu8TYa6Fp8aayZqJvbUJYl5C1Z/o+zgHMTjeec0Th5GIuw9XUJkkx8TT5Fh7aWJvX9BbHlMaJjAqc+G/wiIImvKlsuIsovU6TH0P/XiysoWXeUWM7JqR8Y/05+yELy+xAMKT7PfEXE1fWOlGcCJsarLYGhh/7Jypwfh8Y/wOtYdKOGODxDnzq2f2VySsEiAf0EL0=#012np0005626460.localdomain,192.168.122.104,np0005626460* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAILDN/X/h1SJivdlJg6UrBmlF7YgESQ24kCjH//omBjn3#012np0005626460.localdomain,192.168.122.104,np0005626460* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLjVObKHLJCn+kOorv0tRLu5M/EwGgxQnczR69veoTwgXNRB/xCzi30v7fJ2uWbQGJXou02P5IiwAQmFSv1vKpE=#012np0005626465.localdomain,192.168.122.107,np0005626465* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCUc8l2oYgfdO7xb3vN27co3Q/sFNU6Rw5wThiW1JMfeIzI90ZzS/L+BpsDsX8q2CW9QOHXrbUormpGsiNnix5j1P29Jc6e9A2mDlipXBrFSUiVZa8UOL03lFSz4nElapkASin2GCdHqy7//gGdQMKRP62VXpdhofb7i/N/gGoV5hSc8Q36KFDbWpvPkhD5H8nZtAfyxM99KwlC62D8jSN+gdoRtMRFPQTtyvyskyrgnXGC6xV71WTa6LJ6Meo7tfj4JlvDAWwlD+f9Ruu2ty2aHd2feVVKYvxZ4Z45iSfJnNxRFJvu1QOY0IU4Fj942leKwr6f0B5ogPFlTI7wRrAB1d9tri1WW2aL1AqYhdZscWi0VArYxLQr7BCVqz8KgFIzjbPoJ7uYnWcuDSiWlC1NJVO7Ij2natf8wZyvSyH+vydamkyoaNwxMnm4qs0/rvjwL49MdrHB79rXjHYJpt/JCBvn9a/rh5KqVH40P00DP35H71zyHPCSu1L20S/wY1k=#012np0005626465.localdomain,192.168.122.107,np0005626465* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMNM6I52u2PlIbUuPV1wF+vgd5UIhGpYLByAkJDxsiFm#012np0005626465.localdomain,192.168.122.107,np0005626465* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOxvbePT9GQElB7TGQuLRzkjxtXeKA7IbYbWBmgWolf09tVtPZHcG12wdG6fePoATmwyX4PIJb5sC28KiqtOgIE=#012np0005626461.localdomain,192.168.122.105,np0005626461* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDBCzU24t9gA5R+exm4rHJ2VytHuq8uUoKuu6SZ07dskKR77n7TwlsZhsDjpzwsddHd+lvsfvOVmolxjJsCmq7LJRMGA/mczHXsGGb43YPZPKsiJ6KMPDORy5/ihhnqixBYVmBGtdPu/Hh/udGnymZgR/RYGltDDHoCfGGiEcHJSIuf/Bv2Uv4xFnxFjDrWQFrkJ5Grq1xC7cGXgC3gAiTCjGHkG9rb/oyTUjjM8LaaRYIjeoDQZu1/8y5pl6cnhW21VTA+u55SkSimb/g5oOuSmrv899iHFwb54uLINXvA4aTtduUnxNQBVRyFvWa3yCZXVJeYlcVP8Q9tljn9anN1aISnS311Jmay6zUY927bxnzrpkwaV7Ggwtvi6vlVy84ZvOJ/IJ2boDiMujh1ZpT3bxXG3Oy0EjfBVbpkS6r2MbGTPj/xWnosJ6JNVbb9LW7Ftfi3/NFfAb7PpTgY036DA8LYoYIfqxVJUhlo5fJjqqOLa/zbvZVwrFCG+Zm160=#012np0005626461.localdomain,192.168.122.105,np0005626461* ssh-ed25519 
AAAAC3NzaC1lZDI1NTE5AAAAID3BKJ5iitZOMOyRmWwrIHEgrBaSUAXcN/yddsH5p67P#012np0005626461.localdomain,192.168.122.105,np0005626461* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCX+ELPnNre0Bl1NdaYE8R/rtodFHjWfK7n06TW2wvAyLhge/A+53E2vGTXA9jfYXEEH2g0XKcYHlkb3dM70CTQ=#012np0005626463.localdomain,192.168.122.106,np0005626463* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/Caj4zYKd24ctvaRU1Hf9nT058OF4bRnDJ3bHimmkyIL7cccXAxo3lx50wZHWRYBhF5Wes6TmqnUTTK1h5wVdI8f7YtQ9IyMIlfoEiTThF5PgODVuRYq+YGjFIy7MTPyBnB2428aT4dlYqHSuxK2gL6ALlCJHNyeh3RW3jCOG89veDoRmbqHGoaD+xPRnfsdHLoLFNfxT4UJiKRuqsEd5fNtc392ROSa5XM3PPIs3YTypYmpfFHs1B1j+y6oZV8Ha/QXqURpI7/aJmfnDzXLMsLWp4GRpkwzljvNp87S5HL+kJMo79n0Vmh2JdN1orNP/4A2t/TENckHbrZCm+YmPqUqvpHkAZfFfmvP62YZTPq/qOjBMMq6ulGSHd2I4XfE7NNZRKoS3G4HVlBb0ONS13PaWx9rrJCRlF64L1dHSt9zpKrvRbWkSdXA0PwwehrU5/OBo1IY4WsRlWmPeET1/dFWiIr1t9uGjp5vmACAx7rnC6G5qSEhQ3/k1Wa57k/k=#012np0005626463.localdomain,192.168.122.106,np0005626463* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPpIpPeSZdEjLEgb7zYHVhKnwBDipROOVgmUJe3QzecH#012np0005626463.localdomain,192.168.122.106,np0005626463* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJUV/eK8X671P+PPyOxoifS2hhEKYup7ygc301iPJDoOs3TgLodw2jNy/egXEc0x3WdkTwXltmBlHqmWw5ro05Q=#012 create=True mode=0644 path=/tmp/ansible.w2vcy2ch state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8907 DF PROTO=TCP SPT=58462 DPT=9105 SEQ=299049209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE85E060000000001030307) Feb 23 04:16:17 localhost 
python3.9[146851]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.w2vcy2ch' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:16:18 localhost python3.9[146945]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.w2vcy2ch state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:19 localhost systemd-logind[759]: Session 44 logged out. Waiting for processes to exit. Feb 23 04:16:19 localhost systemd[1]: session-44.scope: Deactivated successfully. Feb 23 04:16:19 localhost systemd[1]: session-44.scope: Consumed 4.375s CPU time. Feb 23 04:16:19 localhost systemd-logind[759]: Removed session 44. Feb 23 04:16:22 localhost sshd[146960]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33432 DF PROTO=TCP SPT=57960 DPT=9102 SEQ=315792718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE878A10000000001030307) Feb 23 04:16:24 localhost sshd[146962]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:24 localhost systemd-logind[759]: New session 45 of user zuul. Feb 23 04:16:24 localhost systemd[1]: Started Session 45 of User zuul. 
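The session above shows the full ssh_known_hosts refresh pattern: create a tempfile, write the managed host-key block into it with `blockinfile`, `cat` the tempfile over /etc/ssh/ssh_known_hosts, then delete the tempfile. A sketch of that flow replayed against scratch paths under /tmp (the key line is a placeholder, not a key from the log):

```shell
#!/bin/sh
# Replay of the ssh_known_hosts update flow logged above, against
# scratch paths so it is safe to run anywhere:
#   tempfile -> blockinfile -> cat into place -> remove tempfile
set -eu
work=/tmp/known-hosts-demo
mkdir -p "$work"
tmp="$work/ansible.tmpfile"     # ansible.builtin.tempfile
dest="$work/ssh_known_hosts"    # stands in for /etc/ssh/ssh_known_hosts

: > "$tmp"
# Placeholder entry; the real block carries one rsa/ed25519/ecdsa line
# per compute node, as seen in the blockinfile invocation above.
KEY_LINE='np0005626466.localdomain,192.168.122.108 ssh-ed25519 AAAA-placeholder'

# ansible.builtin.blockinfile: content between BEGIN/END markers
{
  echo '# BEGIN ANSIBLE MANAGED BLOCK'
  echo "$KEY_LINE"
  echo '# END ANSIBLE MANAGED BLOCK'
} >> "$tmp"

# ansible.legacy.command: cat tmpfile > destination
cat "$tmp" > "$dest"

# ansible.builtin.file state=absent: clean up the tempfile
rm -f "$tmp"
```

Writing into a tempfile first and copying it into place in one `cat` keeps the window where the known_hosts file is partially written as small as possible.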
Feb 23 04:16:25 localhost python3.9[147055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:16:27 localhost python3.9[147151]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 23 04:16:28 localhost python3.9[147245]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:16:29 localhost python3.9[147338]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:16:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45380 DF PROTO=TCP SPT=38814 DPT=9100 SEQ=2662443519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE88F160000000001030307) Feb 23 04:16:30 localhost python3.9[147431]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:16:31 localhost python3.9[147525]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:16:32 localhost python3.9[147620]: ansible-ansible.builtin.file Invoked with 
path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:32 localhost systemd[1]: session-45.scope: Deactivated successfully. Feb 23 04:16:32 localhost systemd[1]: session-45.scope: Consumed 3.890s CPU time. Feb 23 04:16:32 localhost systemd-logind[759]: Session 45 logged out. Waiting for processes to exit. Feb 23 04:16:32 localhost systemd-logind[759]: Removed session 45. Feb 23 04:16:33 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51162 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE89BFB0000000001030307) Feb 23 04:16:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51163 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8A0060000000001030307) Feb 23 04:16:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51164 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8A8070000000001030307) Feb 23 04:16:37 localhost sshd[147635]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:37 localhost systemd-logind[759]: New session 46 of user zuul. Feb 23 04:16:37 localhost systemd[1]: Started Session 46 of User zuul. 
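The nftables handling in the log follows a two-phase pattern: all fragment files are concatenated and syntax-checked with `nft -c -f -`, and only later are the chains and then the flush/rules/jump fragments applied with `nft -f`, after which the `.changed` marker file is removed. A sketch of that shape, with the file names from the log but trivial stand-in rule content, and the privileged `nft` invocations shown as comments so the sketch runs unprivileged:

```shell
#!/bin/sh
# Sketch of the edpm nftables reload sequence in the log: concatenate
# the fragments, dry-run check, then apply. Rule content here is a
# minimal stand-in, not the real edpm ruleset.
set -eu
d=/tmp/edpm-nft-demo
mkdir -p "$d"
echo 'table inet filter { chain EDPM_INPUT { } }'     > "$d/edpm-chains.nft"
echo 'flush chain inet filter EDPM_INPUT'             > "$d/edpm-flushes.nft"
echo 'add rule inet filter EDPM_INPUT counter accept' > "$d/edpm-rules.nft"

# Validation phase, same shape as the logged command:
#   cat edpm-chains.nft edpm-flushes.nft edpm-rules.nft \
#       edpm-update-jumps.nft edpm-jumps.nft | nft -c -f -
cat "$d/edpm-chains.nft" "$d/edpm-flushes.nft" "$d/edpm-rules.nft" \
    > "$d/combined.nft"

# Apply phase from the log (requires root):
#   nft -f /etc/nftables/edpm-chains.nft
#   cat edpm-flushes.nft edpm-rules.nft edpm-update-jumps.nft | nft -f -
#   rm -f /etc/nftables/edpm-rules.nft.changed   # clear the change marker
echo "combined ruleset at $d/combined.nft"
```

Checking the concatenation with `-c` before touching the live ruleset means a bad fragment fails the play instead of leaving the firewall half-applied; the `.changed` file acts as an idempotence marker so the flush-and-reload only runs when the rules file was actually rewritten.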
Feb 23 04:16:38 localhost python3.9[147728]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:16:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44083 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8B4530000000001030307) Feb 23 04:16:39 localhost python3.9[147824]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:16:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51165 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8B7C60000000001030307) Feb 23 04:16:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44084 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8B8460000000001030307) Feb 23 04:16:40 localhost python3.9[147878]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Feb 23 04:16:42 
localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44085 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8C0460000000001030307) Feb 23 04:16:45 localhost python3.9[147970]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:16:46 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3622 DF PROTO=TCP SPT=49526 DPT=9105 SEQ=1733835990 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8D0060000000001030307) Feb 23 04:16:46 localhost python3.9[148063]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:47 localhost python3.9[148155]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51166 DF PROTO=TCP SPT=52072 DPT=9101 SEQ=2960822016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8D8070000000001030307) Feb 23 04:16:48 localhost python3.9[148247]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:16:49 localhost python3.9[148337]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 23 04:16:50 localhost python3.9[148427]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:16:51 localhost python3.9[148519]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:16:51 localhost systemd[1]: session-46.scope: Deactivated successfully. 
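The sequence above records the reboot-required bookkeeping: `needs-restarting -r` (from yum-utils) is run, a marker directory is created under /var/lib/openstack/reboot_required/, and the command's explanation is written into a `needs_restarting` file there; a later `find` on the directory treats any file present as "reboot pending". A sketch of that flow against a scratch directory, with a fallback message so it runs where yum-utils is absent:

```shell
#!/bin/sh
# Sketch of the reboot-required marker flow logged above. The marker
# directory stands in for /var/lib/openstack/reboot_required; the
# fallback text is illustrative, not the exact tool output.
set -eu
marker_dir=/tmp/reboot_required_demo
mkdir -p "$marker_dir"

# needs-restarting -r exits non-zero when a reboot is recommended
if command -v needs-restarting >/dev/null 2>&1; then
    out=$(needs-restarting -r) || true
else
    out='Core libraries or services have been updated since boot-up.'
fi

# file state=touch + lineinfile: persist the reason, owner-readable only
printf '%s\n' "$out" > "$marker_dir/needs_restarting"
chmod 0600 "$marker_dir/needs_restarting"

# find paths=[marker_dir] file_type=file: any file means "reboot needed"
if [ -n "$(find "$marker_dir" -type f)" ]; then
    echo "reboot required"
fi
```

Persisting the marker as a file lets a separate, later play decide when to actually reboot the node, rather than coupling the check and the reboot in one run.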
Feb 23 04:16:51 localhost systemd[1]: session-46.scope: Consumed 9.001s CPU time. Feb 23 04:16:51 localhost systemd-logind[759]: Session 46 logged out. Waiting for processes to exit. Feb 23 04:16:51 localhost systemd-logind[759]: Removed session 46. Feb 23 04:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60006 DF PROTO=TCP SPT=52864 DPT=9102 SEQ=4217256930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8EDD10000000001030307) Feb 23 04:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44087 DF PROTO=TCP SPT=51532 DPT=9882 SEQ=3768975670 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8F0060000000001030307) Feb 23 04:16:57 localhost sshd[148536]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60008 DF PROTO=TCP SPT=52864 DPT=9102 SEQ=4217256930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE8F9C70000000001030307) Feb 23 04:16:57 localhost sshd[148537]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:16:57 localhost systemd-logind[759]: New session 47 of user zuul. Feb 23 04:16:57 localhost systemd[1]: Started Session 47 of User zuul. 
Feb 23 04:16:58 localhost python3.9[148631]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:17:00 localhost python3.9[148727]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:17:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52483 DF PROTO=TCP SPT=55562 DPT=9100 SEQ=984419830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE908460000000001030307) Feb 23 04:17:00 localhost sshd[148787]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:17:01 localhost python3.9[148821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:17:02 localhost python3.9[148894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838220.7216723-177-270581568256455/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:17:02 localhost python3.9[148986]: 
ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52484 DF PROTO=TCP SPT=55562 DPT=9100 SEQ=984419830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE910460000000001030307)
Feb 23 04:17:03 localhost python3.9[149078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:03 localhost python3.9[149151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838222.8237774-250-43749575388980/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:04 localhost python3.9[149243]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:05 localhost python3.9[149335]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:05 localhost python3.9[149408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838224.682711-323-105024199746830/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61162 DF PROTO=TCP SPT=44394 DPT=9101 SEQ=2399873393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE91D470000000001030307)
Feb 23 04:17:06 localhost python3.9[149500]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:07 localhost python3.9[149592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:07 localhost python3.9[149695]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838226.6899624-396-72216657166734/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:08 localhost python3.9[149818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:08 localhost python3.9[149925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28238 DF PROTO=TCP SPT=56366 DPT=9882 SEQ=1366371807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE929820000000001030307)
Feb 23 04:17:09 localhost python3.9[149998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838228.5334349-470-123841446130468/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:10 localhost python3.9[150090]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:10 localhost python3.9[150182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:11 localhost python3.9[150255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838230.372813-536-237209744266713/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:12 localhost python3.9[150347]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28240 DF PROTO=TCP SPT=56366 DPT=9882 SEQ=1366371807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE935860000000001030307)
Feb 23 04:17:12 localhost python3.9[150440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:13 localhost python3.9[150513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838232.2018623-611-25720325986337/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:13 localhost python3.9[150605]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:14 localhost python3.9[150697]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:15 localhost python3.9[150770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838233.9995575-683-205098696455471/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=704aa8bdbea515a6da96c2b63bce412faf6bceda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52486 DF PROTO=TCP SPT=55562 DPT=9100 SEQ=984419830 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE940060000000001030307)
Feb 23 04:17:15 localhost systemd[1]: session-47.scope: Deactivated successfully.
Feb 23 04:17:15 localhost systemd[1]: session-47.scope: Consumed 11.729s CPU time.
Feb 23 04:17:15 localhost systemd-logind[759]: Session 47 logged out. Waiting for processes to exit.
Feb 23 04:17:15 localhost systemd-logind[759]: Removed session 47.
Feb 23 04:17:17 localhost chronyd[140850]: Selected source 167.160.187.12 (pool.ntp.org)
Feb 23 04:17:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61164 DF PROTO=TCP SPT=44394 DPT=9101 SEQ=2399873393 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE94E060000000001030307)
Feb 23 04:17:21 localhost sshd[150786]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:17:21 localhost systemd-logind[759]: New session 48 of user zuul.
Feb 23 04:17:21 localhost systemd[1]: Started Session 48 of User zuul.
Feb 23 04:17:22 localhost python3.9[150881]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:22 localhost python3.9[150973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:23 localhost python3.9[151046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838242.228574-57-235985671653714/.source.conf _original_basename=ceph.conf follow=False checksum=00be6682e39722cc7ebf9f74611435726ea0928d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4671 DF PROTO=TCP SPT=36964 DPT=9102 SEQ=817969717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE962FF0000000001030307)
Feb 23 04:17:24 localhost python3.9[151138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28242 DF PROTO=TCP SPT=56366 DPT=9882 SEQ=1366371807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE966060000000001030307)
Feb 23 04:17:25 localhost python3.9[151211]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838243.686841-57-272677179248303/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=bb97f2335ebfccbfb2bd8d50bbb589ce7e034c5d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:25 localhost systemd[1]: session-48.scope: Deactivated successfully.
Feb 23 04:17:25 localhost systemd[1]: session-48.scope: Consumed 2.323s CPU time.
Feb 23 04:17:25 localhost systemd-logind[759]: Session 48 logged out. Waiting for processes to exit.
Feb 23 04:17:25 localhost systemd-logind[759]: Removed session 48.
Feb 23 04:17:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4673 DF PROTO=TCP SPT=36964 DPT=9102 SEQ=817969717 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE96F060000000001030307)
Feb 23 04:17:30 localhost sshd[151226]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:17:30 localhost systemd-logind[759]: New session 49 of user zuul.
Feb 23 04:17:30 localhost systemd[1]: Started Session 49 of User zuul.
Feb 23 04:17:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50481 DF PROTO=TCP SPT=60186 DPT=9100 SEQ=4212654414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE97D860000000001030307)
Feb 23 04:17:31 localhost sshd[151320]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:17:31 localhost python3.9[151319]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:17:32 localhost python3.9[151417]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50482 DF PROTO=TCP SPT=60186 DPT=9100 SEQ=4212654414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE985870000000001030307)
Feb 23 04:17:33 localhost python3.9[151509]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:17:34 localhost python3.9[151599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:17:35 localhost python3.9[151691]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 23 04:17:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2755 DF PROTO=TCP SPT=44236 DPT=9101 SEQ=3556866097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE992470000000001030307)
Feb 23 04:17:36 localhost python3.9[151783]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 23 04:17:37 localhost python3.9[151837]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:17:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=931 DF PROTO=TCP SPT=51646 DPT=9882 SEQ=1751365345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE99EB30000000001030307)
Feb 23 04:17:40 localhost sshd[151840]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:17:41 localhost python3.9[151933]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 04:17:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=933 DF PROTO=TCP SPT=51646 DPT=9882 SEQ=1751365345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9AAC60000000001030307)
Feb 23 04:17:43 localhost python3[152028]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 23 04:17:44 localhost python3.9[152120]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28152 DF PROTO=TCP SPT=58040 DPT=9105 SEQ=955673459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9B6060000000001030307)
Feb 23 04:17:45 localhost python3.9[152212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:45 localhost python3.9[152260]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:46 localhost python3.9[152352]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:47 localhost python3.9[152400]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.e6yqn866 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:47 localhost python3.9[152492]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:48 localhost python3.9[152540]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2757 DF PROTO=TCP SPT=44236 DPT=9101 SEQ=3556866097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9C2060000000001030307)
Feb 23 04:17:48 localhost python3.9[152632]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:17:49 localhost python3[152725]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 23 04:17:51 localhost python3.9[152817]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:51 localhost python3.9[152892]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838270.7627308-426-46994912217770/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:53 localhost python3.9[152984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:53 localhost python3.9[153059]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838272.660234-471-239901030407009/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14301 DF PROTO=TCP SPT=50292 DPT=9102 SEQ=3909814091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9D8300000000001030307)
Feb 23 04:17:54 localhost python3.9[153151]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=935 DF PROTO=TCP SPT=51646 DPT=9882 SEQ=1751365345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9DA070000000001030307)
Feb 23 04:17:55 localhost python3.9[153226]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838273.916604-516-233012937398515/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:55 localhost python3.9[153318]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:56 localhost python3.9[153393]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838275.1686063-561-164450185047193/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:57 localhost python3.9[153485]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:17:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14303 DF PROTO=TCP SPT=50292 DPT=9102 SEQ=3909814091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9E4460000000001030307)
Feb 23 04:17:57 localhost python3.9[153560]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838276.594574-606-50614896584458/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:58 localhost python3.9[153652]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:17:59 localhost python3.9[153744]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:18:00 localhost python3.9[153839]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:18:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35684 DF PROTO=TCP SPT=60332 DPT=9100 SEQ=3859742910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9F2C60000000001030307)
Feb 23 04:18:01 localhost python3.9[153931]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:18:02 localhost python3.9[154024]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:18:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35685 DF PROTO=TCP SPT=60332 DPT=9100 SEQ=3859742910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BE9FAC60000000001030307)
Feb 23 04:18:03 localhost python3.9[154118]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:18:04 localhost python3.9[154213]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:18:05 localhost python3.9[154303]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:18:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34093 DF PROTO=TCP SPT=32832 DPT=9101 SEQ=1727785860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA07860000000001030307)
Feb 23 04:18:07 localhost python3.9[154396]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005626463.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:6e:1d:57:37" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:18:07 localhost ovs-vsctl[154397]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005626463.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:6e:1d:57:37 external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 23 04:18:08 localhost python3.9[154489]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:18:08 localhost python3.9[154582]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:18:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62695 DF PROTO=TCP SPT=45470 DPT=9882 SEQ=1824332054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA13E30000000001030307)
Feb 23 04:18:09 localhost python3.9[154706]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:18:10 localhost python3.9[154830]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:18:10 localhost python3.9[154893]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:18:11 localhost python3.9[154985]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:18:11 localhost sshd[154988]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:18:12 localhost python3.9[155035]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:18:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62697 DF PROTO=TCP SPT=45470 DPT=9882 SEQ=1824332054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA20060000000001030307)
Feb 23 04:18:12 localhost python3.9[155127]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:18:14 localhost python3.9[155219]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:18:14 localhost python3.9[155267]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:18:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35687 DF PROTO=TCP SPT=60332 DPT=9100 SEQ=3859742910 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA2A060000000001030307)
Feb 23 04:18:15 localhost python3.9[155359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:18:15 localhost python3.9[155407]: ansible-ansible.legacy.file Invoked with group=root
mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:16 localhost python3.9[155499]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:18:16 localhost systemd[1]: Reloading. Feb 23 04:18:17 localhost systemd-rc-local-generator[155521]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:18:17 localhost systemd-sysv-generator[155525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:18:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:18:17 localhost python3.9[155628]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:18:18 localhost python3.9[155676]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34095 DF PROTO=TCP SPT=32832 DPT=9101 SEQ=1727785860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA38060000000001030307) Feb 23 04:18:18 localhost python3.9[155768]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:18:19 localhost python3.9[155816]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:19 localhost 
sshd[155831]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:18:20 localhost python3.9[155910]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:18:20 localhost systemd[1]: Reloading. Feb 23 04:18:20 localhost systemd-rc-local-generator[155933]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:18:20 localhost systemd-sysv-generator[155940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:18:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:18:21 localhost systemd[1]: Starting Create netns directory... Feb 23 04:18:21 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 23 04:18:21 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 04:18:21 localhost systemd[1]: Finished Create netns directory. 
Feb 23 04:18:22 localhost python3.9[156044]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:23 localhost python3.9[156136]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:18:23 localhost python3.9[156209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838302.7522328-1341-27714097638205/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10955 DF PROTO=TCP SPT=51976 DPT=9102 SEQ=101057569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA4D600000000001030307) Feb 23 04:18:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62699 DF PROTO=TCP SPT=45470 DPT=9882 SEQ=1824332054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA50060000000001030307) Feb 23 
04:18:25 localhost python3.9[156301]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:26 localhost python3.9[156393]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:26 localhost python3.9[156485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:18:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10957 DF PROTO=TCP SPT=51976 DPT=9102 SEQ=101057569 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA59870000000001030307) Feb 23 04:18:27 localhost python3.9[156560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838306.3763897-1440-44078644384584/.source.json _original_basename=.kvvmbsge follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None 
group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:27 localhost python3.9[156650]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52734 DF PROTO=TCP SPT=39394 DPT=9100 SEQ=1762409738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA63D60000000001030307) Feb 23 04:18:30 localhost python3.9[156903]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Feb 23 04:18:31 localhost python3.9[156995]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:18:32 localhost python3[157087]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:18:32 localhost python3[157087]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "bfb93be9d83c3121be0312d4d8c02944841d931c726f68b412221913286262d4",#012 "Digest": "sha256:5a01d6902fcff84f31d264784a24433f1266e51e84e70ca3796953855fdec417",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 
"quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:5a01d6902fcff84f31d264784a24433f1266e51e84e70ca3796953855fdec417"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-02-23T06:34:22.194153324Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 347092937,#012 "VirtualSize": 347092937,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/f3afd1cf5e6198a170887a65c5f10af446afae7f60b1c2348209fc3be458dddf/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",#012 
"sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",#012 "sha256:4488e457e941888ff222080c5c98fc98b827e2e0699d850c0a8b0f12f152d8f5",#012 "sha256:bde1ac8945157434308ea323cfa7054085e8af54598c165ad28f8de2052547eb"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2026-02-17T01:25:07.246646992Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:07.246739119Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260216\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:12.132997501Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-02-23T06:08:39.081651802Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081666472Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081677733Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": 
true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081688343Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081701553Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081710413Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.413481757Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:09:13.490649497Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:09:16.454967918Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util- Feb 23 04:18:32 localhost podman[157138]: 2026-02-23 09:18:32.843551039 +0000 UTC m=+0.092819704 container remove 1a34372b503b38a7a4a9fbe2dded8f39f23d7890b6ab89723f700fc71ccd8b6e (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c) Feb 23 04:18:32 localhost python3[157087]: 
ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Feb 23 04:18:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52736 DF PROTO=TCP SPT=39394 DPT=9100 SEQ=1762409738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA6FC70000000001030307) Feb 23 04:18:32 localhost podman[157152]: Feb 23 04:18:32 localhost podman[157152]: 2026-02-23 09:18:32.951392399 +0000 UTC m=+0.086312162 container create 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2) Feb 23 04:18:32 localhost podman[157152]: 2026-02-23 09:18:32.911347734 +0000 UTC m=+0.046267517 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 23 04:18:32 localhost python3[157087]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 
quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Feb 23 04:18:34 localhost python3.9[157280]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:18:35 localhost python3.9[157374]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:36 localhost python3.9[157420]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:18:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32006 DF PROTO=TCP SPT=58964 DPT=9101 SEQ=2434591119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA7CC60000000001030307) Feb 23 04:18:36 localhost python3.9[157511]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838316.128913-1674-140198899223835/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:37 localhost python3.9[157557]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False 
name=None state=None enabled=None force=None masked=None Feb 23 04:18:37 localhost systemd[1]: Reloading. Feb 23 04:18:37 localhost systemd-rc-local-generator[157580]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:18:37 localhost systemd-sysv-generator[157583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:18:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:18:38 localhost python3.9[157639]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:18:38 localhost systemd[1]: Reloading. Feb 23 04:18:38 localhost systemd-rc-local-generator[157664]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:18:38 localhost systemd-sysv-generator[157669]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:18:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:18:38 localhost systemd[1]: Starting ovn_controller container... Feb 23 04:18:38 localhost systemd[1]: Started libcrun container. 
Feb 23 04:18:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e81b0c2abad543b5961626b9b2efedf2d0a2337c6f3b40a4800cdd043c8a8213/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Feb 23 04:18:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:18:38 localhost podman[157681]: 2026-02-23 09:18:38.809644232 +0000 UTC m=+0.160306519 container init 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, container_name=ovn_controller) Feb 23 04:18:38 localhost ovn_controller[157695]: + sudo -E kolla_set_configs Feb 23 04:18:38 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:18:38 localhost podman[157681]: 2026-02-23 09:18:38.8462758 +0000 UTC m=+0.196937987 container start 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:18:38 localhost edpm-start-podman-container[157681]: ovn_controller Feb 23 04:18:38 localhost systemd[1]: Created slice User Slice of UID 0. Feb 23 04:18:38 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Feb 23 04:18:38 localhost systemd[1]: Finished User Runtime Directory /run/user/0. 
Feb 23 04:18:38 localhost systemd[1]: Starting User Manager for UID 0... Feb 23 04:18:38 localhost edpm-start-podman-container[157680]: Creating additional drop-in dependency for "ovn_controller" (83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc) Feb 23 04:18:38 localhost systemd[1]: Reloading. Feb 23 04:18:39 localhost podman[157703]: 2026-02-23 09:18:39.01629944 +0000 UTC m=+0.163833979 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0) Feb 23 04:18:39 localhost systemd[157729]: Queued start job for default target Main User Target. 
Feb 23 04:18:39 localhost systemd[157729]: Created slice User Application Slice. Feb 23 04:18:39 localhost systemd[157729]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Feb 23 04:18:39 localhost systemd[157729]: Started Daily Cleanup of User's Temporary Directories. Feb 23 04:18:39 localhost systemd[157729]: Reached target Paths. Feb 23 04:18:39 localhost systemd[157729]: Reached target Timers. Feb 23 04:18:39 localhost systemd[157729]: Starting D-Bus User Message Bus Socket... Feb 23 04:18:39 localhost systemd[157729]: Starting Create User's Volatile Files and Directories... Feb 23 04:18:39 localhost systemd-rc-local-generator[157782]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:18:39 localhost podman[157703]: 2026-02-23 09:18:39.056246241 +0000 UTC m=+0.203780760 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:18:39 localhost systemd[157729]: Finished Create User's Volatile Files and Directories. Feb 23 04:18:39 localhost systemd[157729]: Listening on D-Bus User Message Bus Socket. Feb 23 04:18:39 localhost systemd[157729]: Reached target Sockets. Feb 23 04:18:39 localhost systemd[157729]: Reached target Basic System. Feb 23 04:18:39 localhost systemd[157729]: Reached target Main User Target. Feb 23 04:18:39 localhost systemd[157729]: Startup finished in 118ms. Feb 23 04:18:39 localhost systemd-sysv-generator[157785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:18:39 localhost podman[157703]: unhealthy Feb 23 04:18:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:18:39 localhost systemd[1]: Started User Manager for UID 0. Feb 23 04:18:39 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:18:39 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Failed with result 'exit-code'. 
Feb 23 04:18:39 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Feb 23 04:18:39 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 04:18:39 localhost systemd[1]: Started ovn_controller container. Feb 23 04:18:39 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:18:39 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:18:39 localhost systemd[1]: Started Session c12 of User root. Feb 23 04:18:39 localhost ovn_controller[157695]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:18:39 localhost ovn_controller[157695]: INFO:__main__:Validating config file Feb 23 04:18:39 localhost ovn_controller[157695]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:18:39 localhost ovn_controller[157695]: INFO:__main__:Writing out command to execute Feb 23 04:18:39 localhost systemd[1]: session-c12.scope: Deactivated successfully. 
Feb 23 04:18:39 localhost ovn_controller[157695]: ++ cat /run_command Feb 23 04:18:39 localhost ovn_controller[157695]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Feb 23 04:18:39 localhost ovn_controller[157695]: + ARGS= Feb 23 04:18:39 localhost ovn_controller[157695]: + sudo kolla_copy_cacerts Feb 23 04:18:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21854 DF PROTO=TCP SPT=45638 DPT=9882 SEQ=3727925378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA89130000000001030307) Feb 23 04:18:39 localhost systemd[1]: Started Session c13 of User root. Feb 23 04:18:39 localhost systemd[1]: session-c13.scope: Deactivated successfully. Feb 23 04:18:39 localhost ovn_controller[157695]: + [[ ! -n '' ]] Feb 23 04:18:39 localhost ovn_controller[157695]: + . kolla_extend_start Feb 23 04:18:39 localhost ovn_controller[157695]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock ' Feb 23 04:18:39 localhost ovn_controller[157695]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\''' Feb 23 04:18:39 localhost ovn_controller[157695]: + umask 0022 Feb 23 04:18:39 localhost ovn_controller[157695]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8] Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00004|main|INFO|OVS IDL reconnected, force recompute. 
Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting... Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00006|main|INFO|OVNSB IDL reconnected, force recompute. Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00011|features|INFO|OVS Feature: ct_flush, state: supported Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting... Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00013|main|INFO|OVS feature set changed, force recompute. Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... 
Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00017|reconnect|INFO|unix:/run/openvswitch/db.sock: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute. Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00021|main|INFO|OVS feature set changed, force recompute. Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-5b0126-0 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-585d62-0 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-b9c72d-0 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00026|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00027|binding|INFO|Claiming lport a27e5011-2016-4b16-b5e8-04b555b30bc4 for this chassis. 
Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00028|binding|INFO|a27e5011-2016-4b16-b5e8-04b555b30bc4: Claiming fa:16:3e:a0:9d:00 192.168.0.12 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00029|binding|INFO|Removing lport a27e5011-2016-4b16-b5e8-04b555b30bc4 ovn-installed in OVS Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting... Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00030|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-5b0126-0 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-585d62-0 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00033|ovn_bfd|INFO|Enabled BFD on interface ovn-b9c72d-0 Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00034|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis 
(sb_readonly=0) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00035|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00036|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00037|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:39 localhost ovn_controller[157695]: 2026-02-23T09:18:39Z|00038|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:40 localhost ovn_controller[157695]: 2026-02-23T09:18:40Z|00039|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:40 localhost ovn_controller[157695]: 2026-02-23T09:18:40Z|00040|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:40 localhost ovn_controller[157695]: 2026-02-23T09:18:40Z|00041|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:40 localhost ovn_controller[157695]: 2026-02-23T09:18:40Z|00042|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:18:41 localhost python3.9[157895]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:18:42 localhost python3.9[157987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:18:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 
LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21856 DF PROTO=TCP SPT=45638 DPT=9882 SEQ=3727925378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEA95060000000001030307) Feb 23 04:18:42 localhost python3.9[158060]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838321.7459009-1809-270329035118453/.source.yaml _original_basename=.zyhhtvo1 follow=False checksum=181037f60084fed8e752a93376456c5747d0788c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:18:42 localhost sshd[158075]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:18:43 localhost python3.9[158153]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:18:43 localhost ovs-vsctl[158155]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload Feb 23 04:18:45 localhost python3.9[158247]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:18:45 localhost ovs-vsctl[158249]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." 
column external_ids Feb 23 04:18:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52738 DF PROTO=TCP SPT=39394 DPT=9100 SEQ=1762409738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAA0060000000001030307) Feb 23 04:18:46 localhost python3.9[158342]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:18:46 localhost ovs-vsctl[158343]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options Feb 23 04:18:47 localhost systemd[1]: session-49.scope: Deactivated successfully. Feb 23 04:18:47 localhost systemd-logind[759]: Session 49 logged out. Waiting for processes to exit. Feb 23 04:18:47 localhost systemd[1]: session-49.scope: Consumed 42.965s CPU time. Feb 23 04:18:47 localhost systemd-logind[759]: Removed session 49. Feb 23 04:18:47 localhost ovn_controller[157695]: 2026-02-23T09:18:47Z|00043|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 ovn-installed in OVS Feb 23 04:18:47 localhost ovn_controller[157695]: 2026-02-23T09:18:47Z|00044|binding|INFO|Setting lport a27e5011-2016-4b16-b5e8-04b555b30bc4 up in Southbound Feb 23 04:18:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32008 DF PROTO=TCP SPT=58964 DPT=9101 SEQ=2434591119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAAC060000000001030307) Feb 23 04:18:49 localhost systemd[1]: Stopping User Manager for UID 0... Feb 23 04:18:49 localhost systemd[157729]: Activating special unit Exit the Session... 
Feb 23 04:18:49 localhost systemd[157729]: Stopped target Main User Target. Feb 23 04:18:49 localhost systemd[157729]: Stopped target Basic System. Feb 23 04:18:49 localhost systemd[157729]: Stopped target Paths. Feb 23 04:18:49 localhost systemd[157729]: Stopped target Sockets. Feb 23 04:18:49 localhost systemd[157729]: Stopped target Timers. Feb 23 04:18:49 localhost systemd[157729]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 04:18:49 localhost systemd[157729]: Closed D-Bus User Message Bus Socket. Feb 23 04:18:49 localhost systemd[157729]: Stopped Create User's Volatile Files and Directories. Feb 23 04:18:49 localhost systemd[157729]: Removed slice User Application Slice. Feb 23 04:18:49 localhost systemd[157729]: Reached target Shutdown. Feb 23 04:18:49 localhost systemd[157729]: Finished Exit the Session. Feb 23 04:18:49 localhost systemd[157729]: Reached target Exit the Session. Feb 23 04:18:49 localhost systemd[1]: user@0.service: Deactivated successfully. Feb 23 04:18:49 localhost systemd[1]: Stopped User Manager for UID 0. Feb 23 04:18:49 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Feb 23 04:18:49 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Feb 23 04:18:49 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Feb 23 04:18:49 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Feb 23 04:18:49 localhost systemd[1]: Removed slice User Slice of UID 0. Feb 23 04:18:52 localhost sshd[158360]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:18:52 localhost systemd-logind[759]: New session 51 of user zuul. Feb 23 04:18:52 localhost systemd[1]: Started Session 51 of User zuul. 
Feb 23 04:18:53 localhost python3.9[158453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17606 DF PROTO=TCP SPT=50204 DPT=9102 SEQ=1687700845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAC2900000000001030307) Feb 23 04:18:54 localhost python3.9[158549]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21858 DF PROTO=TCP SPT=45638 DPT=9882 SEQ=3727925378 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAC6070000000001030307) Feb 23 04:18:55 localhost python3.9[158641]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:56 localhost python3.9[158733]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17608 DF PROTO=TCP SPT=50204 DPT=9102 SEQ=1687700845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEACE860000000001030307) Feb 23 04:18:57 localhost python3.9[158826]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:58 localhost python3.9[158918]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:18:58 localhost python3.9[159008]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:18:59 localhost python3.9[159100]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Feb 23 04:18:59 localhost sshd[159101]: main: sshd: ssh-rsa algorithm is disabled Feb 23 
04:19:00 localhost python3.9[159192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29948 DF PROTO=TCP SPT=47350 DPT=9100 SEQ=1493365353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEADD070000000001030307) Feb 23 04:19:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:19:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x557956c5c850#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) 
FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Feb 23 04:19:01 
localhost python3.9[159265]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838340.1822007-213-231750768156571/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:02 localhost python3.9[159355]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:02 localhost python3.9[159428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838341.627566-258-219189195936682/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29949 DF PROTO=TCP SPT=47350 DPT=9100 SEQ=1493365353 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAE5060000000001030307) Feb 23 04:19:03 localhost python3.9[159520]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:19:04 localhost python3.9[159574]: ansible-ansible.legacy.dnf Invoked with 
name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:19:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21914 DF PROTO=TCP SPT=36452 DPT=9101 SEQ=3026775186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAF2060000000001030307) Feb 23 04:19:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:19:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 
5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for 
pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564b561042d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s 
write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_sl Feb 23 04:19:09 localhost python3.9[159668]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 23 04:19:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17610 DF PROTO=TCP SPT=50204 DPT=9102 SEQ=1687700845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEAFE060000000001030307) Feb 23 04:19:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:19:09 localhost systemd[1]: tmp-crun.27XSTS.mount: Deactivated successfully. 
Feb 23 04:19:09 localhost podman[159686]: 2026-02-23 09:19:09.913972698 +0000 UTC m=+0.089621934 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:19:09 localhost ovn_controller[157695]: 2026-02-23T09:19:09Z|00045|memory|INFO|18092 kB peak resident set size after 30.5 seconds Feb 23 04:19:09 localhost ovn_controller[157695]: 2026-02-23T09:19:09Z|00046|memory|INFO|idl-cells-OVN_Southbound:4072 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:80 lflow-cache-entries-cache-matches:195 
lflow-cache-size-KB:348 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:157 ofctrl_installed_flow_usage-KB:114 ofctrl_sb_flow_ref_usage-KB:68 Feb 23 04:19:09 localhost podman[159686]: 2026-02-23 09:19:09.960539945 +0000 UTC m=+0.136189201 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_controller) Feb 23 04:19:09 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:19:10 localhost python3.9[159800]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:11 localhost python3.9[159894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838350.370946-369-83221992807801/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:11 localhost python3.9[160009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:12 localhost python3.9[160095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838351.3380702-369-88403242977912/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=56548 DF PROTO=TCP SPT=49832 DPT=9882 SEQ=196626078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB0A470000000001030307) Feb 23 04:19:13 localhost python3.9[160185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:13 localhost python3.9[160256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838353.0235553-501-270861747460134/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:14 localhost python3.9[160346]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31617 DF PROTO=TCP SPT=53314 DPT=9105 SEQ=1808066821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB14060000000001030307) Feb 23 04:19:15 localhost python3.9[160417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838354.1198308-501-137196997278846/.source.conf 
_original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:15 localhost python3.9[160507]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:19:16 localhost python3.9[160602]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:17 localhost python3.9[160694]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:17 localhost ovn_controller[157695]: 2026-02-23T09:19:17Z|00047|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory Feb 23 04:19:17 localhost python3.9[160742]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None 
attributes=None Feb 23 04:19:18 localhost python3.9[160834]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21916 DF PROTO=TCP SPT=36452 DPT=9101 SEQ=3026775186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB22060000000001030307) Feb 23 04:19:19 localhost python3.9[160882]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:19 localhost python3.9[160974]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:21 localhost python3.9[161066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:22 localhost python3.9[161114]: ansible-ansible.legacy.file Invoked with group=root 
mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:22 localhost python3.9[161206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:23 localhost sshd[161236]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:19:23 localhost python3.9[161255]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59809 DF PROTO=TCP SPT=41464 DPT=9102 SEQ=2877943873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB37C00000000001030307) Feb 23 04:19:24 localhost python3.9[161348]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:19:24 localhost 
systemd[1]: Reloading. Feb 23 04:19:24 localhost systemd-rc-local-generator[161373]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:19:24 localhost systemd-sysv-generator[161377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:19:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56550 DF PROTO=TCP SPT=49832 DPT=9882 SEQ=196626078 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB3A070000000001030307) Feb 23 04:19:26 localhost python3.9[161478]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59811 DF PROTO=TCP SPT=41464 DPT=9102 SEQ=2877943873 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB43C60000000001030307) Feb 23 04:19:27 localhost python3.9[161526]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:27 localhost python3.9[161618]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:28 localhost python3.9[161666]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:29 localhost python3.9[161758]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:19:29 localhost systemd[1]: Reloading. Feb 23 04:19:29 localhost systemd-rc-local-generator[161781]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:19:29 localhost systemd-sysv-generator[161784]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:19:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:19:29 localhost systemd[1]: Starting Create netns directory... Feb 23 04:19:29 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. 
Feb 23 04:19:29 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 04:19:29 localhost systemd[1]: Finished Create netns directory. Feb 23 04:19:30 localhost python3.9[161891]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20706 DF PROTO=TCP SPT=52912 DPT=9100 SEQ=365865694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB52460000000001030307) Feb 23 04:19:32 localhost python3.9[161983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:32 localhost python3.9[162056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838371.558441-954-101878687577726/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=20707 DF PROTO=TCP SPT=52912 DPT=9100 SEQ=365865694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB5A460000000001030307) Feb 23 04:19:34 localhost python3.9[162148]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:34 localhost python3.9[162240]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:19:35 localhost python3.9[162332]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:36 localhost python3.9[162407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838375.0409348-1053-130955197809679/.source.json _original_basename=.2fp_r1fz follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c 
MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13589 DF PROTO=TCP SPT=57966 DPT=9101 SEQ=2521930306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB67060000000001030307) Feb 23 04:19:36 localhost python3.9[162497]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:38 localhost sshd[162705]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:19:38 localhost python3.9[162752]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False Feb 23 04:19:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20427 DF PROTO=TCP SPT=55950 DPT=9882 SEQ=638005839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB73730000000001030307) Feb 23 04:19:39 localhost python3.9[162844]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:19:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:19:40 localhost podman[162937]: 2026-02-23 09:19:40.831510052 +0000 UTC m=+0.093288544 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 23 04:19:40 localhost podman[162937]: 2026-02-23 09:19:40.895304374 +0000 UTC m=+0.157082756 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:19:40 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:19:40 localhost python3[162936]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:19:41 localhost python3[162936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "76bd0f417d97eed33ec13822a6468fff2a43c066ff0ef717fb226d4a1fc97b17",#012 "Digest": "sha256:0a8901bdd982c4ba62e40905edf375097daf8fd968b1839b56832f37354d5b07",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:0a8901bdd982c4ba62e40905edf375097daf8fd968b1839b56832f37354d5b07"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-02-23T06:26:05.098634295Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 784228609,#012 "VirtualSize": 784228609,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/4ca138c1babff33aa47b0f593cc672ab03770d4205069570de2d0e7691f07ed3/diff:/var/lib/containers/storage/overlay/7a6a75b4bc44910de031f240cbd770d29244a190eb01a1840ff2078eb2d894ad/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/5bf4078070f41854870417452ad68470796913522011b663ed0d8d22a6f27928/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",#012 "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",#012 "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",#012 "sha256:28e68e9ecec07805a02cd85d7efe631108e3186cd82263ab9cb109564a3435f5",#012 "sha256:2c8b50875d9f0980f38972811e1dbbc8e64c448e40a8be21ff8837be00cf89ab",#012 "sha256:2782735a76d8db3e6692125b10fd55ced9f8590ef8ae6abf986ddc10f33757f4"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2026-02-17T01:25:07.246646992Z",#012 "created_by": "/bin/sh -c #(nop) 
ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:07.246739119Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260216\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:12.132997501Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-02-23T06:08:39.081651802Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081666472Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081677733Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081688343Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081701553Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081710413Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.413481757Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:09:13.490649497Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main 
clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.con Feb 23 04:19:41 localhost podman[163010]: 2026-02-23 09:19:41.35459389 +0000 UTC m=+0.070773118 container remove 9cbce0029944c609d539a721dd4f976ae4d067424b899bddefbfcdec4505ccf9 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'cf62475d9880911ecf982eff6ab572ad'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Feb 23 04:19:41 localhost python3[162936]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Feb 23 04:19:41 localhost podman[163023]: Feb 23 04:19:41 localhost podman[163023]: 2026-02-23 09:19:41.483412941 +0000 UTC m=+0.109285698 container create 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:19:41 localhost podman[163023]: 2026-02-23 09:19:41.418568147 +0000 UTC m=+0.044440964 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 23 04:19:41 localhost python3[162936]: 
ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume 
/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Feb 23 04:19:42 localhost python3.9[163152]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:19:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20429 DF PROTO=TCP SPT=55950 DPT=9882 SEQ=638005839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB7F860000000001030307) Feb 23 04:19:43 localhost python3.9[163246]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:44 localhost python3.9[163292]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:19:44 localhost python3.9[163383]: ansible-copy Invoked with 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771838384.0895164-1287-153800842458946/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20709 DF PROTO=TCP SPT=52912 DPT=9100 SEQ=365865694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB8A060000000001030307) Feb 23 04:19:45 localhost python3.9[163429]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:19:45 localhost systemd[1]: Reloading. Feb 23 04:19:45 localhost systemd-rc-local-generator[163451]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:19:45 localhost systemd-sysv-generator[163455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:19:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:19:46 localhost python3.9[163511]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:19:46 localhost systemd[1]: Reloading. 
Feb 23 04:19:46 localhost systemd-rc-local-generator[163539]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:19:46 localhost systemd-sysv-generator[163542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:19:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:19:46 localhost systemd[1]: Starting ovn_metadata_agent container... Feb 23 04:19:46 localhost systemd[1]: Started libcrun container. Feb 23 04:19:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee1f92392988c110dc098475c8a3c73d1754101417a8a753bd7eeb8a3f0fc5f9/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 23 04:19:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee1f92392988c110dc098475c8a3c73d1754101417a8a753bd7eeb8a3f0fc5f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:19:46 localhost podman[163553]: 2026-02-23 09:19:46.855566102 +0000 UTC m=+0.179036515 container init 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0) Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + sudo -E 
kolla_set_configs Feb 23 04:19:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:19:46 localhost podman[163553]: 2026-02-23 09:19:46.914511433 +0000 UTC m=+0.237981856 container start 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, 
org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:19:46 localhost edpm-start-podman-container[163553]: ovn_metadata_agent Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Validating config file Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Copying service configuration files Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Writing out command to execute Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: 
INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: ++ cat /run_command Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + CMD=neutron-ovn-metadata-agent Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + ARGS= Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + sudo kolla_copy_cacerts Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + [[ ! -n '' ]] Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + . 
kolla_extend_start Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: Running command: 'neutron-ovn-metadata-agent' Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + umask 0022 Feb 23 04:19:46 localhost ovn_metadata_agent[163567]: + exec neutron-ovn-metadata-agent Feb 23 04:19:47 localhost podman[163575]: 2026-02-23 09:19:46.986049404 +0000 UTC m=+0.086230575 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 23 04:19:47 localhost edpm-start-podman-container[163552]: Creating additional drop-in dependency for "ovn_metadata_agent" (11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739) Feb 23 04:19:47 localhost podman[163575]: 2026-02-23 09:19:47.083208737 +0000 UTC m=+0.183389908 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 23 04:19:47 localhost systemd[1]: Reloading. Feb 23 04:19:47 localhost systemd-rc-local-generator[163637]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:19:47 localhost systemd-sysv-generator[163643]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:19:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:19:47 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:19:47 localhost systemd[1]: Started ovn_metadata_agent container. 
Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.486 163572 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.486 163572 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.486 163572 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG 
neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.487 163572 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 
localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.488 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.489 163572 DEBUG neutron.agent.ovn.metadata_agent [-] 
dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.490 163572 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.491 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 
09:19:48.492 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG 
neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.493 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.494 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.495 163572 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.496 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG 
neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 
09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.497 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent 
[-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.498 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.499 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.500 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 
09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.501 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 
2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.502 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG 
neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.503 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = 
/etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.504 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG 
neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.505 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.506 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 
163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.507 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.508 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 
2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.509 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.510 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.511 
163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.512 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.513 163572 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.514 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.515 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.516 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.517 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.518 163572 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.518 163572 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.570 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.570 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.570 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.571 163572 INFO 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.571 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.586 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 96b5bb93-7341-4ce6-9b93-6a5de566c711 (UUID: 96b5bb93-7341-4ce6-9b93-6a5de566c711) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.601 163572 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.602 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.602 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.602 163572 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.604 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.608 163572 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 
09:19:48.615 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:9d:00 192.168.0.12'], port_security=['fa:16:3e:a0:9d:00 192.168.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.12/24', 'neutron:device_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005626463.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '18508c14-7c5f-4fc2-8d9a-66df41a4ab8c ef2f14d6-40b1-49a6-83d1-89d52b525905', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1694950-12d2-4254-85f1-37700098294d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a27e5011-2016-4b16-b5e8-04b555b30bc4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.616 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '96b5bb93-7341-4ce6-9b93-6a5de566c711'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': '1e311e03-9c9b-56ee-88f3-50a2fe78fcac', 'neutron:ovn-metadata-sb-cfg': '2'}, 
name=96b5bb93-7341-4ce6-9b93-6a5de566c711, nb_cfg_timestamp=1771838328073, nb_cfg=5) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.616 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a27e5011-2016-4b16-b5e8-04b555b30bc4 in datapath 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d bound to our chassis on insert#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.617 163572 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.617 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.617 163572 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.618 163572 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.618 163572 INFO oslo_service.service [-] Starting 1 workers#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.620 163572 DEBUG oslo_service.service [-] Started child 163670 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.623 163572 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9da5b53d-3184-450f-9a5b-bdba1a6c9f6d#033[00m Feb 23 04:19:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:48.624 163572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpckuxnm6t/privsep.sock']#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.625 163670 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-184385'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.651 163670 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.652 163670 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.652 163670 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.657 163670 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.659 163670 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Feb 23 04:19:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:48.670 163670 INFO eventlet.wsgi.server [-] (163670) wsgi starting up on 
http:/var/lib/neutron/metadata_proxy#033[00m Feb 23 04:19:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13591 DF PROTO=TCP SPT=57966 DPT=9101 SEQ=2521930306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEB98060000000001030307) Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.193 163572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.194 163572 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpckuxnm6t/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.089 163675 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.094 163675 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.097 163675 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.097 163675 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163675#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.197 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[82ff1654-b311-4176-be73-37665b2ba583]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.688 163675 DEBUG oslo_concurrency.lockutils [-] Acquiring lock 
"context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.688 163675 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:19:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:49.688 163675 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:19:49 localhost python3.9[163755]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.145 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[74c77b82-e97c-4f54-afd9-f6f1283dd6a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.147 163572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmprtzwo8ob/privsep.sock']#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.757 163572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.758 163572 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmprtzwo8ob/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 23 04:19:50 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:50.643 163808 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.650 163808 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.653 163808 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.653 163808 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163808#033[00m Feb 23 04:19:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:50.761 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6dcb06-42aa-415c-973e-1a05cd6ea9c4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost python3.9[163857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.220 163808 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.221 163808 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.221 163808 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 
0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:19:51 localhost python3.9[163933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838390.5905259-1422-136039156239963/.source.yaml _original_basename=.982kglsr follow=False checksum=5e9d1f3425ea21486875902a84faa4fb54cf7178 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.719 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[e07ebd5b-468f-4b4f-aab0-9c28dd8d9383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.722 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[c8cf58e5-8fe7-456a-999d-9890843db5dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.744 163808 DEBUG oslo.privsep.daemon [-] privsep: reply[6424b2f5-9605-43a7-80e5-98dd66d6f7de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.763 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ec2c6a0a-c4f3-46d6-ab52-39edfc07e4cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9da5b53d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], 
['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:c8:0e:6f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7142, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 
'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 643129, 'reachable_time': 21618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 
16, 'flags': 2, 'sequence_number': 255, 'pid': 163953, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.779 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[cf130fa9-b99d-40fb-bb8e-03a72ce06639]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9da5b53d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643138, 'tstamp': 643138}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tap9da5b53d-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643140, 'tstamp': 643140}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643136, 'tstamp': 643136}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 
'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:e6f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 643129, 'tstamp': 643129}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163954, 'error': None, 'target': 'ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.835 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[82b4f5d2-e9a2-4b70-bb61-9e4f17c472cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.837 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9da5b53d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.879 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9da5b53d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.880 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.880 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9da5b53d-30, 
col_values=(('external_ids', {'iface-id': '4143c8ea-7577-4792-9744-bcff90eb20f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.881 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:19:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:51.885 163572 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8829ngjy/privsep.sock']#033[00m Feb 23 04:19:51 localhost systemd[1]: session-51.scope: Deactivated successfully. Feb 23 04:19:52 localhost systemd[1]: session-51.scope: Consumed 33.005s CPU time. Feb 23 04:19:52 localhost systemd-logind[759]: Session 51 logged out. Waiting for processes to exit. Feb 23 04:19:52 localhost systemd-logind[759]: Removed session 51. 
Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.517 163572 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.518 163572 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp8829ngjy/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.415 163964 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.421 163964 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.424 163964 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.425 163964 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163964#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.522 163964 DEBUG oslo.privsep.daemon [-] privsep: reply[75176f92-edb7-421a-9017-c446b8c74dad]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.912 163964 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:52.913 163964 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:19:52 localhost ovn_metadata_agent[163567]: 2026-02-23 
09:19:52.913 163964 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.359 163964 DEBUG oslo.privsep.daemon [-] privsep: reply[c8029fb2-f7c3-4ea1-ba1e-5468d89c8f5a]: (4, ['ovnmeta-9da5b53d-3184-450f-9a5b-bdba1a6c9f6d']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.363 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, column=external_ids, values=({'neutron:ovn-metadata-id': '1e311e03-9c9b-56ee-88f3-50a2fe78fcac'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.363 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.364 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG 
oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.377 163572 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.378 163572 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.379 163572 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.380 163572 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.381 163572 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.382 163572 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 
'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.383 163572 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.384 163572 DEBUG oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.385 163572 DEBUG oslo_service.service [-] host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.386 163572 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] 
ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.387 163572 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.388 163572 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] 
logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.389 163572 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] 
max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.390 163572 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.391 163572 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 
163572 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.392 163572 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.393 163572 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.394 163572 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.395 163572 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:19:53 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.396 163572 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.397 163572 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.398 163572 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.399 163572 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.400 163572 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.401 163572 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.401 163572 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.402 163572 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.403 163572 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.404 163572 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.405 163572 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.406 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.407 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.408 163572 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.409 163572 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.410 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.411 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.412 163572 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.413 163572 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.414 163572 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.415 163572 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.416 163572 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.417 163572 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.418 163572 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.419 163572 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.420 163572 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.421 163572 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.422 163572 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.423 163572 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.424 163572 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.425 163572 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.426 163572 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.427 163572 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.428 163572 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.429 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.430 163572 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:19:53 localhost
ovn_metadata_agent[163567]: 2026-02-23 09:19:53.431 163572 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.432 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 
163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.433 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG 
oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.434 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG 
oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.435 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.436 163572 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:19:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:19:53.437 163572 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 23 04:19:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56263 DF PROTO=TCP SPT=50442 DPT=9102 SEQ=1965221192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBACF00000000001030307) Feb 23 04:19:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20431 DF PROTO=TCP SPT=55950 DPT=9882 SEQ=638005839 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBB0060000000001030307) Feb 23 04:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56265 DF PROTO=TCP SPT=50442 DPT=9102 SEQ=1965221192 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBB9460000000001030307) Feb 23 04:19:58 localhost sshd[163969]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:19:58 localhost systemd-logind[759]: New session 52 of user zuul. Feb 23 04:19:58 localhost systemd[1]: Started Session 52 of User zuul. Feb 23 04:19:59 localhost python3.9[164062]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:20:00 localhost python3.9[164158]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:00 localhost sshd[164183]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:20:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37824 DF PROTO=TCP SPT=45092 DPT=9100 SEQ=961740688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBC7870000000001030307) Feb 23 04:20:01 localhost python3.9[164265]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:01 localhost systemd[1]: tmp-crun.x18woI.mount: Deactivated successfully. Feb 23 04:20:01 localhost systemd[1]: libpod-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74.scope: Deactivated successfully. 
Feb 23 04:20:01 localhost podman[164266]: 2026-02-23 09:20:01.54765599 +0000 UTC m=+0.083597434 container died 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13) Feb 23 04:20:01 localhost podman[164266]: 2026-02-23 09:20:01.59358603 +0000 UTC m=+0.129527474 container cleanup 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, 
com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:31:49Z, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team) Feb 23 04:20:01 localhost podman[164280]: 2026-02-23 09:20:01.64244578 +0000 UTC m=+0.087175506 container remove 3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, io.openshift.expose-services=) Feb 23 04:20:01 localhost systemd[1]: libpod-conmon-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74.scope: Deactivated successfully. Feb 23 04:20:02 localhost systemd[1]: var-lib-containers-storage-overlay-0f5369acf1913ef4c00375204a8528c500963efed1d6cd27d7b10a2d16e203b5-merged.mount: Deactivated successfully. Feb 23 04:20:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3df457bc42eb89aa6185b88a77d4c5934a55cee63ff7da200319cff5193cbe74-userdata-shm.mount: Deactivated successfully. Feb 23 04:20:02 localhost python3.9[164387]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:20:02 localhost systemd[1]: Reloading. Feb 23 04:20:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37825 DF PROTO=TCP SPT=45092 DPT=9100 SEQ=961740688 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBCF860000000001030307) Feb 23 04:20:02 localhost systemd-rc-local-generator[164413]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:20:02 localhost systemd-sysv-generator[164416]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:20:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:20:03 localhost python3.9[164512]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:20:03 localhost network[164529]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:20:03 localhost network[164530]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:20:03 localhost network[164531]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 04:20:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:20:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15488 DF PROTO=TCP SPT=46708 DPT=9101 SEQ=3722656640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBDC460000000001030307) Feb 23 04:20:09 localhost python3.9[164732]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:09 localhost systemd[1]: Reloading. Feb 23 04:20:09 localhost systemd-sysv-generator[164759]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:20:09 localhost systemd-rc-local-generator[164751]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:20:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:20:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16153 DF PROTO=TCP SPT=49138 DPT=9882 SEQ=959314150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBE8A30000000001030307) Feb 23 04:20:09 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. Feb 23 04:20:10 localhost python3.9[164863]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:20:11 localhost podman[164957]: 2026-02-23 09:20:11.656303803 +0000 UTC m=+0.093948805 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216) Feb 23 04:20:11 localhost podman[164957]: 2026-02-23 09:20:11.694511813 +0000 UTC m=+0.132156845 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:20:11 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:20:11 localhost python3.9[164956]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16155 DF PROTO=TCP SPT=49138 DPT=9882 SEQ=959314150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEBF4C60000000001030307) Feb 23 04:20:12 localhost python3.9[165100]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:14 localhost python3.9[165241]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:15 localhost python3.9[165334]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21996 DF PROTO=TCP SPT=43090 DPT=9105 SEQ=1258698052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC00060000000001030307) Feb 23 04:20:15 localhost python3.9[165427]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:20:16 localhost sshd[165443]: main: sshd: 
ssh-rsa algorithm is disabled Feb 23 04:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:20:17 localhost podman[165477]: 2026-02-23 09:20:17.887164154 +0000 UTC m=+0.063868846 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 23 04:20:17 localhost podman[165477]: 2026-02-23 09:20:17.893270912 +0000 UTC m=+0.069975584 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:20:17 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:20:18 localhost python3.9[165539]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15490 DF PROTO=TCP SPT=46708 DPT=9101 SEQ=3722656640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC0C070000000001030307) Feb 23 04:20:18 localhost python3.9[165631]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:19 localhost python3.9[165723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:20 localhost python3.9[165815]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:20 localhost python3.9[165907]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:21 localhost python3.9[165999]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:22 localhost python3.9[166091]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Feb 23 04:20:23 localhost python3.9[166183]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58851 DF PROTO=TCP SPT=56256 DPT=9102 SEQ=1936941513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC22200000000001030307) Feb 23 04:20:24 localhost python3.9[166275]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16157 DF PROTO=TCP SPT=49138 DPT=9882 SEQ=959314150 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC24060000000001030307) Feb 23 04:20:24 localhost python3.9[166367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:25 localhost python3.9[166459]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:26 localhost python3.9[166551]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:26 localhost python3.9[166643]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58853 DF PROTO=TCP SPT=56256 DPT=9102 SEQ=1936941513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC2E470000000001030307) Feb 23 04:20:27 localhost python3.9[166735]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:20:28 localhost python3.9[166827]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:29 localhost python3.9[166919]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 23 04:20:29 localhost python3.9[167011]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:20:29 localhost systemd[1]: Reloading. Feb 23 04:20:30 localhost systemd-sysv-generator[167037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:20:30 localhost systemd-rc-local-generator[167033]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:20:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:20:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53614 DF PROTO=TCP SPT=43070 DPT=9100 SEQ=426322492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC3C860000000001030307) Feb 23 04:20:30 localhost python3.9[167139]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:31 localhost python3.9[167232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:32 localhost python3.9[167325]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:32 localhost python3.9[167418]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53615 DF PROTO=TCP SPT=43070 DPT=9100 SEQ=426322492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC44860000000001030307) Feb 23 04:20:33 localhost python3.9[167511]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:34 localhost python3.9[167604]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25537 DF PROTO=TCP SPT=39442 DPT=9101 SEQ=990721497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC51860000000001030307) Feb 23 04:20:36 localhost sshd[167620]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:20:36 localhost python3.9[167698]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:20:38 localhost python3.9[167792]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Feb 23 04:20:38 localhost python3.9[167885]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False 
gid_min=None gid_max=None Feb 23 04:20:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11812 DF PROTO=TCP SPT=51890 DPT=9882 SEQ=1220898106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC5DD30000000001030307) Feb 23 04:20:39 localhost sshd[167938]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:20:39 localhost python3.9[167985]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Feb 23 04:20:41 localhost python3.9[168085]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:20:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:20:41 localhost podman[168122]: 2026-02-23 09:20:41.913499081 +0000 UTC m=+0.081169869 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, config_id=ovn_controller) Feb 23 04:20:41 localhost podman[168122]: 2026-02-23 09:20:41.951587621 +0000 UTC m=+0.119258469 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:20:41 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:20:42 localhost python3.9[168152]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:20:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11814 DF PROTO=TCP SPT=51890 DPT=9882 SEQ=1220898106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC69C70000000001030307) Feb 23 04:20:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53617 DF PROTO=TCP SPT=43070 DPT=9100 SEQ=426322492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC74060000000001030307) Feb 23 04:20:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:20:48.520 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:20:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:20:48.521 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:20:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:20:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:20:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25539 DF PROTO=TCP SPT=39442 DPT=9101 SEQ=990721497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC82060000000001030307) Feb 23 04:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:20:48 localhost sshd[168242]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:20:48 localhost podman[168231]: 2026-02-23 09:20:48.93708936 +0000 UTC m=+0.111240642 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 04:20:48 localhost podman[168231]: 2026-02-23 09:20:48.968656956 +0000 UTC m=+0.142808268 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:20:48 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. 
Feb 23 04:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17276 DF PROTO=TCP SPT=55316 DPT=9102 SEQ=1337298762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC974F0000000001030307) Feb 23 04:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11816 DF PROTO=TCP SPT=51890 DPT=9882 SEQ=1220898106 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEC9A070000000001030307) Feb 23 04:20:55 localhost sshd[168255]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:20:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17278 DF PROTO=TCP SPT=55316 DPT=9102 SEQ=1337298762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECA3470000000001030307) Feb 23 04:21:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43961 DF PROTO=TCP SPT=41620 DPT=9100 SEQ=1106125900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECB1C60000000001030307) Feb 23 04:21:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43962 DF PROTO=TCP SPT=41620 DPT=9100 SEQ=1106125900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECB9C60000000001030307) Feb 23 04:21:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8066 DF PROTO=TCP SPT=54296 DPT=9101 
SEQ=3613526108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECC6C60000000001030307) Feb 23 04:21:08 localhost kernel: SELinux: Converting 2759 SID table entries... Feb 23 04:21:08 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). Feb 23 04:21:08 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 23 04:21:08 localhost kernel: SELinux: policy capability open_perms=1 Feb 23 04:21:08 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 23 04:21:08 localhost kernel: SELinux: policy capability always_check_network=0 Feb 23 04:21:08 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 23 04:21:08 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 23 04:21:08 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 23 04:21:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56533 DF PROTO=TCP SPT=47668 DPT=9882 SEQ=948237871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECD3020000000001030307) Feb 23 04:21:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56535 DF PROTO=TCP SPT=47668 DPT=9882 SEQ=948237871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECDF060000000001030307) Feb 23 04:21:12 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=19 res=1 Feb 23 04:21:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:21:12 localhost systemd[1]: tmp-crun.jNkysz.mount: Deactivated successfully. 
Feb 23 04:21:12 localhost podman[169328]: 2026-02-23 09:21:12.929596836 +0000 UTC m=+0.094096593 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:21:12 localhost podman[169328]: 2026-02-23 09:21:12.977298391 +0000 UTC m=+0.141798108 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216) Feb 23 04:21:12 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:21:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43964 DF PROTO=TCP SPT=41620 DPT=9100 SEQ=1106125900 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECEA060000000001030307) Feb 23 04:21:15 localhost sshd[169422]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:21:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8068 DF PROTO=TCP SPT=54296 DPT=9101 SEQ=3613526108 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BECF6060000000001030307) Feb 23 04:21:18 localhost kernel: SELinux: Converting 2762 SID table entries... Feb 23 04:21:19 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 23 04:21:19 localhost kernel: SELinux: policy capability open_perms=1 Feb 23 04:21:19 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 23 04:21:19 localhost kernel: SELinux: policy capability always_check_network=0 Feb 23 04:21:19 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 23 04:21:19 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 23 04:21:19 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 23 04:21:19 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=20 res=1 Feb 23 04:21:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:21:19 localhost systemd[1]: tmp-crun.r0HF94.mount: Deactivated successfully. 
Feb 23 04:21:19 localhost podman[169449]: 2026-02-23 09:21:19.955531714 +0000 UTC m=+0.111532861 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:21:19 localhost 
podman[169449]: 2026-02-23 09:21:19.985443062 +0000 UTC m=+0.141444129 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:21:19 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25007 DF PROTO=TCP SPT=48294 DPT=9102 SEQ=1925478040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED0C7F0000000001030307) Feb 23 04:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56537 DF PROTO=TCP SPT=47668 DPT=9882 SEQ=948237871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED10060000000001030307) Feb 23 04:21:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25009 DF PROTO=TCP SPT=48294 DPT=9102 SEQ=1925478040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED18860000000001030307) Feb 23 04:21:29 localhost kernel: SELinux: Converting 2765 SID table entries... 
Feb 23 04:21:29 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 23 04:21:29 localhost kernel: SELinux: policy capability open_perms=1 Feb 23 04:21:29 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 23 04:21:29 localhost kernel: SELinux: policy capability always_check_network=0 Feb 23 04:21:29 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 23 04:21:29 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 23 04:21:29 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 23 04:21:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24876 DF PROTO=TCP SPT=43368 DPT=9100 SEQ=4035075529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED27060000000001030307) Feb 23 04:21:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24877 DF PROTO=TCP SPT=43368 DPT=9100 SEQ=4035075529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED2F060000000001030307) Feb 23 04:21:33 localhost sshd[169480]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:21:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3862 DF PROTO=TCP SPT=59578 DPT=9101 SEQ=3952865216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED3BC60000000001030307) Feb 23 04:21:37 localhost kernel: SELinux: Converting 2765 SID table entries... 
Feb 23 04:21:37 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 23 04:21:37 localhost kernel: SELinux: policy capability open_perms=1 Feb 23 04:21:37 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 23 04:21:37 localhost kernel: SELinux: policy capability always_check_network=0 Feb 23 04:21:37 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 23 04:21:37 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 23 04:21:37 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 23 04:21:38 localhost systemd[1]: Reloading. Feb 23 04:21:38 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=22 res=1 Feb 23 04:21:38 localhost systemd-rc-local-generator[169509]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:21:38 localhost systemd-sysv-generator[169518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:21:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:21:38 localhost systemd[1]: Reloading. Feb 23 04:21:38 localhost systemd-rc-local-generator[169552]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:21:38 localhost systemd-sysv-generator[169555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:21:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:21:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25011 DF PROTO=TCP SPT=48294 DPT=9102 SEQ=1925478040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED48060000000001030307) Feb 23 04:21:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16364 DF PROTO=TCP SPT=34588 DPT=9882 SEQ=3469396630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED54460000000001030307) Feb 23 04:21:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:21:43 localhost podman[169570]: 2026-02-23 09:21:43.933144033 +0000 UTC m=+0.096526864 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller) Feb 23 04:21:44 localhost podman[169570]: 2026-02-23 09:21:44.014200787 +0000 UTC m=+0.177583618 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:21:44 localhost systemd[1]: 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:21:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22553 DF PROTO=TCP SPT=39002 DPT=9105 SEQ=3661566410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED5E060000000001030307) Feb 23 04:21:47 localhost kernel: SELinux: Converting 2766 SID table entries... Feb 23 04:21:47 localhost kernel: SELinux: policy capability network_peer_controls=1 Feb 23 04:21:47 localhost kernel: SELinux: policy capability open_perms=1 Feb 23 04:21:47 localhost kernel: SELinux: policy capability extended_socket_class=1 Feb 23 04:21:47 localhost kernel: SELinux: policy capability always_check_network=0 Feb 23 04:21:47 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Feb 23 04:21:47 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 23 04:21:47 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Feb 23 04:21:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:21:48.522 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:21:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:21:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:21:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:21:48.525 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" 
:: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:21:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3864 DF PROTO=TCP SPT=59578 DPT=9101 SEQ=3952865216 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED6C060000000001030307) Feb 23 04:21:49 localhost sshd[169643]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:21:50 localhost dbus-broker-launch[754]: avc: op=load_policy lsm=selinux seqno=23 res=1 Feb 23 04:21:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:21:50 localhost podman[169647]: 2026-02-23 09:21:50.959545125 +0000 UTC m=+0.147809282 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:21:50 localhost podman[169647]: 2026-02-23 09:21:50.993383093 +0000 UTC m=+0.181647250 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:21:51 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:21:51 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. Feb 23 04:21:51 localhost dbus-broker-launch[750]: Noticed file-system modification, trigger reload. 
Feb 23 04:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10718 DF PROTO=TCP SPT=59078 DPT=9102 SEQ=2991786459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED81B00000000001030307) Feb 23 04:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16366 DF PROTO=TCP SPT=34588 DPT=9882 SEQ=3469396630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED84060000000001030307) Feb 23 04:21:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10720 DF PROTO=TCP SPT=59078 DPT=9102 SEQ=2991786459 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED8DC60000000001030307) Feb 23 04:22:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2967 DF PROTO=TCP SPT=59764 DPT=9100 SEQ=3098478177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BED9C460000000001030307) Feb 23 04:22:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2968 DF PROTO=TCP SPT=59764 DPT=9100 SEQ=3098478177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDA4460000000001030307) Feb 23 04:22:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10913 DF PROTO=TCP SPT=33190 DPT=9101 SEQ=2169928675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BEDB1060000000001030307) Feb 23 04:22:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11696 DF PROTO=TCP SPT=50478 DPT=9882 SEQ=3211331624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDBD630000000001030307) Feb 23 04:22:11 localhost sshd[174582]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:22:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11698 DF PROTO=TCP SPT=50478 DPT=9882 SEQ=3211331624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDC9860000000001030307) Feb 23 04:22:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:22:14 localhost podman[177260]: 2026-02-23 09:22:14.929427063 +0000 UTC m=+0.095252929 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 23 04:22:14 localhost podman[177260]: 2026-02-23 09:22:14.994733321 +0000 UTC m=+0.160559177 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 04:22:15 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:22:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23919 DF PROTO=TCP SPT=59430 DPT=9105 SEQ=2478478734 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDD4070000000001030307) Feb 23 04:22:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10915 DF PROTO=TCP SPT=33190 DPT=9101 SEQ=2169928675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDE2060000000001030307) Feb 23 04:22:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:22:21 localhost podman[182164]: 2026-02-23 09:22:21.909821323 +0000 UTC m=+0.081858571 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:22:21 localhost 
podman[182164]: 2026-02-23 09:22:21.941863924 +0000 UTC m=+0.113901192 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent) Feb 23 04:22:22 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:22:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65527 DF PROTO=TCP SPT=54138 DPT=9102 SEQ=3681896413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDF6E10000000001030307)
Feb 23 04:22:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11700 DF PROTO=TCP SPT=50478 DPT=9882 SEQ=3211331624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEDFA070000000001030307)
Feb 23 04:22:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65529 DF PROTO=TCP SPT=54138 DPT=9102 SEQ=3681896413 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE03060000000001030307)
Feb 23 04:22:28 localhost sshd[186755]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:22:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8466 DF PROTO=TCP SPT=39038 DPT=9100 SEQ=1332103946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE11460000000001030307)
Feb 23 04:22:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8467 DF PROTO=TCP SPT=39038 DPT=9100 SEQ=1332103946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE19460000000001030307)
Feb 23 04:22:34 localhost systemd[1]: Stopping OpenSSH server daemon...
Feb 23 04:22:34 localhost systemd[1]: sshd.service: Deactivated successfully.
Feb 23 04:22:34 localhost systemd[1]: Stopped OpenSSH server daemon.
Feb 23 04:22:34 localhost systemd[1]: sshd.service: Consumed 2.617s CPU time, read 32.0K from disk, written 0B to disk.
Feb 23 04:22:34 localhost systemd[1]: Stopped target sshd-keygen.target.
Feb 23 04:22:34 localhost systemd[1]: Stopping sshd-keygen.target...
Feb 23 04:22:34 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 04:22:34 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 04:22:34 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 23 04:22:34 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 23 04:22:34 localhost systemd[1]: Starting OpenSSH server daemon...
Feb 23 04:22:34 localhost sshd[187670]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:22:34 localhost systemd[1]: Started OpenSSH server daemon.
Feb 23 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:34 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:36 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55417 DF PROTO=TCP SPT=48002 DPT=9101 SEQ=1767193969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE26470000000001030307)
Feb 23 04:22:36 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 04:22:36 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 23 04:22:36 localhost systemd[1]: Reloading.
Feb 23 04:22:36 localhost systemd-rc-local-generator[188133]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:36 localhost systemd-sysv-generator[188139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:36 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:36 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:37 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 04:22:37 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 04:22:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21951 DF PROTO=TCP SPT=38008 DPT=9882 SEQ=4285250185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE32930000000001030307)
Feb 23 04:22:40 localhost python3.9[192593]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 04:22:40 localhost systemd[1]: Reloading.
Feb 23 04:22:41 localhost systemd-sysv-generator[192910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:41 localhost systemd-rc-local-generator[192906]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:41 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21953 DF PROTO=TCP SPT=38008 DPT=9882 SEQ=4285250185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE3E860000000001030307)
Feb 23 04:22:42 localhost python3.9[194073]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 04:22:43 localhost systemd[1]: Reloading.
Feb 23 04:22:43 localhost systemd-rc-local-generator[194165]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:43 localhost systemd-sysv-generator[194171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:43 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:44 localhost python3.9[194559]: ansible-ansible.builtin.systemd Invoked with
enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 23 04:22:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:22:45 localhost systemd[1]: Reloading. Feb 23 04:22:45 localhost podman[195150]: 2026-02-23 09:22:45.269435214 +0000 UTC m=+0.110032897 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:22:45 localhost systemd-rc-local-generator[195248]: /etc/rc.d/rc.local is not marked 
executable, skipping. Feb 23 04:22:45 localhost systemd-sysv-generator[195254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:22:45 localhost podman[195150]: 2026-02-23 09:22:45.336637348 +0000 UTC m=+0.177235031 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 
04:22:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8469 DF PROTO=TCP SPT=39038 DPT=9100 SEQ=1332103946 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE4A060000000001030307)
Feb 23 04:22:45 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 04:22:46 localhost python3.9[195663]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 04:22:46 localhost systemd[1]: Reloading.
Feb 23 04:22:46 localhost systemd-rc-local-generator[195929]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:46 localhost systemd-sysv-generator[195935]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost python3.9[196360]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:22:47 localhost systemd[1]: Reloading.
Feb 23 04:22:47 localhost systemd-rc-local-generator[196588]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:47 localhost systemd-sysv-generator[196593]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:47 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55419 DF PROTO=TCP SPT=48002 DPT=9101 SEQ=1767193969 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE56060000000001030307)
Feb 23 04:22:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:22:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:22:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:22:48.524 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:22:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:22:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:22:48 localhost python3.9[196997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:22:49 localhost sshd[197368]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:22:49 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 04:22:49 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 04:22:49 localhost systemd[1]: man-db-cache-update.service: Consumed 15.645s CPU time.
Feb 23 04:22:49 localhost systemd[1]: run-ra83127cecaa34d288cf51053ffd142d5.service: Deactivated successfully.
Feb 23 04:22:49 localhost systemd[1]: run-rd1915ba4ef78472e8e99bcebfa8e98ea.service: Deactivated successfully.
Feb 23 04:22:49 localhost systemd[1]: Reloading.
Feb 23 04:22:49 localhost systemd-sysv-generator[197513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:49 localhost systemd-rc-local-generator[197508]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:50 localhost python3.9[197632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:22:51 localhost systemd[1]: Reloading.
Feb 23 04:22:51 localhost systemd-rc-local-generator[197656]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:51 localhost systemd-sysv-generator[197661]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:22:52 localhost podman[197672]: 2026-02-23 09:22:52.30053506 +0000 UTC m=+0.087245901 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 23 04:22:52 localhost podman[197672]: 2026-02-23 09:22:52.310167372 +0000 UTC m=+0.096878183 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 04:22:52 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:22:53 localhost python3.9[197800]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60293 DF PROTO=TCP SPT=36928 DPT=9102 SEQ=261598781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE6C110000000001030307)
Feb 23 04:22:54 localhost python3.9[197913]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:22:54 localhost systemd[1]: Reloading.
Feb 23 04:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21955 DF PROTO=TCP SPT=38008 DPT=9882 SEQ=4285250185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE6E060000000001030307)
Feb 23 04:22:54 localhost systemd-sysv-generator[197944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:54 localhost systemd-rc-local-generator[197937]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:54 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:56 localhost python3.9[198061]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 23 04:22:56 localhost systemd[1]: Reloading.
Feb 23 04:22:57 localhost systemd-rc-local-generator[198091]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:22:57 localhost systemd-sysv-generator[198094]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:22:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60295 DF PROTO=TCP SPT=36928 DPT=9102 SEQ=261598781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE78060000000001030307)
Feb 23 04:22:58 localhost python3.9[198210]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:22:59 localhost python3.9[198323]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49530 DF PROTO=TCP SPT=47062 DPT=9100 SEQ=2074974366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE86860000000001030307)
Feb 23 04:23:01 localhost python3.9[198436]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:01 localhost python3.9[198549]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:02 localhost python3.9[198662]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49531 DF PROTO=TCP SPT=47062 DPT=9100 SEQ=2074974366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE8E860000000001030307)
Feb 23 04:23:03 localhost python3.9[198775]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:05 localhost python3.9[198888]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52918 DF PROTO=TCP SPT=52242 DPT=9101 SEQ=5356550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEE9B860000000001030307)
Feb 23 04:23:06 localhost sshd[199002]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:23:06 localhost python3.9[199001]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:07 localhost python3.9[199116]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:08 localhost python3.9[199229]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18811 DF PROTO=TCP SPT=56594 DPT=9882 SEQ=3746675600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEA7C30000000001030307)
Feb 23 04:23:10 localhost python3.9[199342]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:12 localhost python3.9[199455]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18813 DF PROTO=TCP SPT=56594 DPT=9882 SEQ=3746675600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEB3C60000000001030307)
Feb 23 04:23:13 localhost python3.9[199568]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49533 DF PROTO=TCP SPT=47062 DPT=9100 SEQ=2074974366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEBE060000000001030307)
Feb 23 04:23:15 localhost python3.9[199681]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 23 04:23:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:23:15 localhost podman[199683]: 2026-02-23 09:23:15.716176991 +0000 UTC m=+0.089032893 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:23:15 localhost podman[199683]: 2026-02-23 09:23:15.757248541 +0000 UTC m=+0.130104423 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 23 04:23:15 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 04:23:17 localhost python3.9[199820]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:23:17 localhost python3.9[199930]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:23:18 localhost python3.9[200040]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:23:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52920 DF PROTO=TCP SPT=52242 DPT=9101 SEQ=5356550 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEECC060000000001030307)
Feb 23 04:23:19 localhost python3.9[200150]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:23:19 localhost python3.9[200296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:23:20 localhost python3.9[200437]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:23:21 localhost python3.9[200563]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 23 04:23:22 localhost python3.9[200673]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:23:22 localhost systemd[1]: tmp-crun.kLUHDd.mount: Deactivated successfully.
Feb 23 04:23:22 localhost podman[200764]: 2026-02-23 09:23:22.90405021 +0000 UTC m=+0.104294219 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 04:23:22 localhost podman[200764]: 2026-02-23 09:23:22.915233547 +0000 UTC m=+0.115477536 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 23 04:23:22 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:23:22 localhost python3.9[200763]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838601.5410128-1662-47737091287228/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:23 localhost python3.9[200889]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44016 DF PROTO=TCP SPT=34410 DPT=9102 SEQ=368432506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEE1400000000001030307)
Feb 23 04:23:24 localhost python3.9[200979]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838603.132028-1662-67851893940561/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18815 DF PROTO=TCP SPT=56594 DPT=9882 SEQ=3746675600 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEE4070000000001030307)
Feb 23 04:23:24 localhost python3.9[201089]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:25 localhost python3.9[201179]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838604.4266741-1662-114599274264969/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:26 localhost python3.9[201289]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:26 localhost python3.9[201379]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838605.8342266-1662-280878966173408/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44018 DF PROTO=TCP SPT=34410 DPT=9102 SEQ=368432506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEED460000000001030307)
Feb 23 04:23:27 localhost python3.9[201489]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:27 localhost sshd[201580]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:23:28 localhost python3.9[201579]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838607.0073864-1662-145197105874780/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:29 localhost python3.9[201691]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:29 localhost python3.9[201781]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838608.6292882-1662-192434915117389/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:30 localhost python3.9[201891]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1707 DF PROTO=TCP SPT=37970 DPT=9100 SEQ=1028638576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEEFBC60000000001030307)
Feb 23 04:23:31 localhost python3.9[201979]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838609.7980475-1662-49307972051331/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:31 localhost python3.9[202089]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:23:32 localhost python3.9[202179]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771838611.3456676-1662-155137844101064/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1708 DF PROTO=TCP SPT=37970 DPT=9100 SEQ=1028638576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF03C60000000001030307)
Feb 23 04:23:33 localhost python3.9[202289]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:33 localhost python3.9[202399]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:34 localhost python3.9[202509]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:35 localhost python3.9[202619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:35 localhost python3.9[202729]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38834 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=1059110977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF10860000000001030307)
Feb 23 04:23:36 localhost python3.9[202839]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:36 localhost python3.9[202949]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:37 localhost python3.9[203059]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:38 localhost python3.9[203169]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:38 localhost python3.9[203279]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:39 localhost python3.9[203389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30480 DF PROTO=TCP SPT=50418 DPT=9882 SEQ=3914433687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF1CF50000000001030307)
Feb 23 04:23:40 localhost python3.9[203499]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:41 localhost python3.9[203609]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:41 localhost python3.9[203719]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:23:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30482 DF PROTO=TCP SPT=50418 DPT=9882 SEQ=3914433687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF29060000000001030307)
Feb 23 04:23:43 localhost python3.9[203829]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:43 localhost python3.9[203939]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:44 localhost python3.9[204027]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838623.462256-2325-214771761873346/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:44 localhost sshd[204138]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:23:45 localhost python3.9[204137]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1710 DF PROTO=TCP SPT=37970 DPT=9100 SEQ=1028638576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF34060000000001030307) Feb 23 04:23:45 localhost python3.9[204227]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838624.6299677-2325-10760413131730/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 
checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:23:45 localhost podman[204245]: 2026-02-23 09:23:45.917966369 +0000 UTC m=+0.086633982 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Feb 23 04:23:45 localhost 
systemd[1]: tmp-crun.48Y5Wu.mount: Deactivated successfully. Feb 23 04:23:45 localhost podman[204245]: 2026-02-23 09:23:45.965567931 +0000 UTC m=+0.134235524 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216) Feb 23 04:23:45 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:23:46 localhost python3.9[204362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:46 localhost python3.9[204450]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838625.8519723-2325-278676601107263/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:47 localhost python3.9[204560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:47 localhost python3.9[204648]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838626.9844291-2325-2824619806686/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38836 DF PROTO=TCP SPT=34776 DPT=9101 SEQ=1059110977 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF40060000000001030307) Feb 23 04:23:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:23:48.523 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:23:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:23:48.524 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:23:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:23:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:23:48 localhost python3.9[204758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:49 localhost python3.9[204846]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838628.1109343-2325-279520773670372/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:49 localhost python3.9[204956]: 
ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:50 localhost python3.9[205044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838629.3342123-2325-71050521864360/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:51 localhost python3.9[205154]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:51 localhost python3.9[205242]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838630.5485175-2325-277528593880385/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:52 localhost python3.9[205352]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:53 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:23:53 localhost podman[205441]: 2026-02-23 09:23:53.381722553 +0000 UTC m=+0.098914323 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:23:53 localhost podman[205441]: 2026-02-23 09:23:53.393176784 +0000 UTC m=+0.110368544 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, 
org.label-schema.vendor=CentOS) Feb 23 04:23:53 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:23:53 localhost python3.9[205440]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838631.7737117-2325-91894078544770/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25892 DF PROTO=TCP SPT=33876 DPT=9102 SEQ=3029223417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF566F0000000001030307) Feb 23 04:23:54 localhost python3.9[205567]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30484 DF PROTO=TCP SPT=50418 DPT=9882 SEQ=3914433687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF5A070000000001030307) Feb 23 04:23:55 localhost python3.9[205655]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838633.6122322-2325-1913487218414/.source.conf follow=False 
_original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:55 localhost python3.9[205765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:56 localhost python3.9[205853]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838635.4920456-2325-134004433001527/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:57 localhost python3.9[205963]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25894 DF PROTO=TCP SPT=33876 DPT=9102 SEQ=3029223417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF62860000000001030307) Feb 23 04:23:57 localhost python3.9[206051]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771838636.6617522-2325-205131813969326/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:58 localhost python3.9[206161]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:23:59 localhost python3.9[206249]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838637.865976-2325-59996288339509/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:23:59 localhost python3.9[206359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:00 localhost python3.9[206447]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838639.2778425-2325-23532228047330/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:00 localhost python3.9[206557]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63804 DF PROTO=TCP SPT=49832 DPT=9100 SEQ=3024133794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF71060000000001030307) Feb 23 04:24:01 localhost python3.9[206645]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838640.3888333-2325-42499097915428/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:02 localhost python3.9[206753]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63805 DF PROTO=TCP SPT=49832 DPT=9100 SEQ=3024133794 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BEF79060000000001030307) Feb 23 04:24:03 localhost python3.9[206866]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False Feb 23 04:24:04 localhost python3.9[206976]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:24:04 localhost systemd[1]: Reloading. Feb 23 04:24:04 localhost systemd-rc-local-generator[207001]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:24:04 localhost systemd-sysv-generator[207006]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:04 localhost systemd[1]: Starting libvirt logging daemon socket... Feb 23 04:24:04 localhost systemd[1]: Listening on libvirt logging daemon socket. Feb 23 04:24:04 localhost systemd[1]: Starting libvirt logging daemon admin socket... Feb 23 04:24:04 localhost systemd[1]: Listening on libvirt logging daemon admin socket. Feb 23 04:24:04 localhost systemd[1]: Starting libvirt logging daemon... Feb 23 04:24:04 localhost systemd[1]: Started libvirt logging daemon. Feb 23 04:24:06 localhost python3.9[207128]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:24:06 localhost systemd[1]: Reloading. Feb 23 04:24:06 localhost systemd-rc-local-generator[207152]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:24:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43480 DF PROTO=TCP SPT=49158 DPT=9101 SEQ=3915859774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF85C60000000001030307) Feb 23 04:24:06 localhost systemd-sysv-generator[207155]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:06 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Feb 23 04:24:06 localhost systemd[1]: Starting libvirt nodedev daemon socket... Feb 23 04:24:06 localhost systemd[1]: Listening on libvirt nodedev daemon socket. Feb 23 04:24:06 localhost systemd[1]: Starting libvirt nodedev daemon admin socket... Feb 23 04:24:06 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket... Feb 23 04:24:06 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket. 
Feb 23 04:24:06 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket. Feb 23 04:24:06 localhost systemd[1]: Started libvirt nodedev daemon. Feb 23 04:24:06 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Feb 23 04:24:06 localhost setroubleshoot[207166]: Deleting alert a176daef-12e3-44f0-9641-bedf749d0981, it is allowed in current policy Feb 23 04:24:06 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service. Feb 23 04:24:07 localhost python3.9[207311]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:24:07 localhost systemd[1]: Reloading. Feb 23 04:24:07 localhost systemd-sysv-generator[207343]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:24:07 localhost systemd-rc-local-generator[207338]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:24:07 localhost systemd[1]: Starting libvirt proxy daemon socket...
Feb 23 04:24:07 localhost systemd[1]: Listening on libvirt proxy daemon socket.
Feb 23 04:24:07 localhost systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 23 04:24:07 localhost systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 23 04:24:07 localhost systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 23 04:24:07 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 23 04:24:07 localhost systemd[1]: Started libvirt proxy daemon.
Feb 23 04:24:07 localhost setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 217d9264-3142-4991-af77-108d418c8753
Feb 23 04:24:07 localhost setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.

***** Plugin dac_override (91.4 confidence) suggests **********************

If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
Then turn on full auditing to get path information about the offending file and generate the error again.
Do

Turn on full auditing
# auditctl -w /etc/shadow -p w
Try to recreate AVC. Then execute
# ausearch -m avc -ts recent
If you see PATH record check ownership/permissions on file, and fix it,
otherwise report as a bugzilla.

***** Plugin catchall (9.59 confidence) suggests **************************

If you believe that virtlogd should have the dac_read_search capability by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
# semodule -X 300 -i my-virtlogd.pp

Feb 23 04:24:07 localhost setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 217d9264-3142-4991-af77-108d418c8753
Feb 23 04:24:07 localhost setroubleshoot[207166]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.

***** Plugin dac_override (91.4 confidence) suggests **********************

If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
Then turn on full auditing to get path information about the offending file and generate the error again.
Do

Turn on full auditing
# auditctl -w /etc/shadow -p w
Try to recreate AVC. Then execute
# ausearch -m avc -ts recent
If you see PATH record check ownership/permissions on file, and fix it,
otherwise report as a bugzilla.

***** Plugin catchall (9.59 confidence) suggests **************************

If you believe that virtlogd should have the dac_read_search capability by default.
Then you should report this as a bug.
You can generate a local policy module to allow this access.
Do
allow this access for now by executing:
# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
# semodule -X 300 -i my-virtlogd.pp

Feb 23 04:24:07 localhost sshd[207453]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:24:08 localhost python3.9[207487]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 04:24:08 localhost systemd[1]: Reloading.
Feb 23 04:24:08 localhost systemd-sysv-generator[207517]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:24:08 localhost systemd-rc-local-generator[207511]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:08 localhost systemd[1]: Listening on libvirt locking daemon socket. Feb 23 04:24:08 localhost systemd[1]: Starting libvirt QEMU daemon socket... Feb 23 04:24:08 localhost systemd[1]: Listening on libvirt QEMU daemon socket. Feb 23 04:24:08 localhost systemd[1]: Starting libvirt QEMU daemon admin socket... Feb 23 04:24:08 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket... Feb 23 04:24:08 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket. Feb 23 04:24:08 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket. Feb 23 04:24:08 localhost systemd[1]: Started libvirt QEMU daemon. 
Feb 23 04:24:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25896 DF PROTO=TCP SPT=33876 DPT=9102 SEQ=3029223417 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF92070000000001030307) Feb 23 04:24:09 localhost python3.9[207669]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:24:09 localhost systemd[1]: Reloading. Feb 23 04:24:09 localhost systemd-rc-local-generator[207702]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:24:09 localhost systemd-sysv-generator[207709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:09 localhost systemd[1]: Starting libvirt secret daemon socket... Feb 23 04:24:09 localhost systemd[1]: Listening on libvirt secret daemon socket. Feb 23 04:24:09 localhost systemd[1]: Starting libvirt secret daemon admin socket... Feb 23 04:24:09 localhost systemd[1]: Starting libvirt secret daemon read-only socket... Feb 23 04:24:09 localhost systemd[1]: Listening on libvirt secret daemon admin socket. Feb 23 04:24:09 localhost systemd[1]: Listening on libvirt secret daemon read-only socket. Feb 23 04:24:09 localhost systemd[1]: Started libvirt secret daemon. 
Feb 23 04:24:11 localhost python3.9[207852]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:11 localhost python3.9[207962]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 23 04:24:12 localhost python3.9[208072]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12732 DF PROTO=TCP SPT=41654 DPT=9882 SEQ=4265702725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEF9E460000000001030307) Feb 23 04:24:13 localhost python3.9[208184]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None 
limit=None Feb 23 04:24:14 localhost python3.9[208292]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:14 localhost python3.9[208378]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838653.633911-3189-164760666317036/.source.xml follow=False _original_basename=secret.xml.j2 checksum=9110e86c46036bf6b9c9b3a9e049196c9a537971 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:14 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10338 DF PROTO=TCP SPT=40756 DPT=9105 SEQ=3078679058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFA8060000000001030307) Feb 23 04:24:15 localhost python3.9[208488]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine f1fea371-cb69-578d-a3d0-b5c472a84b46#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:16 localhost python3.9[208608]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:16 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:24:16 localhost podman[208719]: 2026-02-23 09:24:16.672753632 +0000 UTC m=+0.089666714 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:24:16 localhost podman[208719]: 2026-02-23 09:24:16.71022013 +0000 UTC m=+0.127133192 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.license=GPLv2) Feb 23 04:24:16 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:24:17 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully. Feb 23 04:24:17 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Feb 23 04:24:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43482 DF PROTO=TCP SPT=49158 DPT=9101 SEQ=3915859774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFB6060000000001030307) Feb 23 04:24:19 localhost python3.9[208971]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:19 localhost python3.9[209081]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:20 localhost python3.9[209169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838659.4437988-3354-271272366115848/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:21 localhost sshd[209223]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:24:21 localhost python3.9[209358]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:21 localhost systemd[1]: tmp-crun.iyeqxf.mount: Deactivated successfully. Feb 23 04:24:21 localhost podman[209389]: 2026-02-23 09:24:21.897135662 +0000 UTC m=+0.098747059 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2) Feb 23 04:24:22 localhost podman[209389]: 2026-02-23 09:24:22.023433612 +0000 UTC m=+0.225044999 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, 
maintainer=Guillaume Abrioux , version=7, name=rhceph, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:24:22 localhost python3.9[209566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:23 localhost python3.9[209674]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:23 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:24:23 localhost podman[209802]: 2026-02-23 09:24:23.622263636 +0000 UTC m=+0.087095873 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:24:23 localhost podman[209802]: 2026-02-23 09:24:23.628758819 +0000 UTC m=+0.093591126 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team) Feb 23 04:24:23 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:24:23 localhost python3.9[209801]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12312 DF PROTO=TCP SPT=41600 DPT=9102 SEQ=1440645661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFCBA00000000001030307) Feb 23 04:24:24 localhost python3.9[209893]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qssxrmrn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12734 DF PROTO=TCP SPT=41654 DPT=9882 SEQ=4265702725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFCE060000000001030307) Feb 23 04:24:24 localhost python3.9[210003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:25 localhost python3.9[210060]: 
ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:26 localhost python3.9[210170]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:26 localhost python3[210281]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 23 04:24:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12314 DF PROTO=TCP SPT=41600 DPT=9102 SEQ=1440645661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFD7C70000000001030307) Feb 23 04:24:27 localhost python3.9[210391]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:28 localhost python3.9[210448]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:28 localhost python3.9[210558]: 
ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:29 localhost python3.9[210648]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838668.3358357-3621-160336285168455/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:30 localhost python3.9[210758]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:30 localhost python3.9[210815]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6740 DF PROTO=TCP SPT=39358 DPT=9100 SEQ=3465392032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFE6060000000001030307) Feb 23 04:24:31 localhost python3.9[210925]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:31 localhost python3.9[210982]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:32 localhost python3.9[211092]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6741 DF PROTO=TCP SPT=39358 DPT=9100 SEQ=3465392032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFEE060000000001030307) Feb 23 04:24:33 localhost python3.9[211182]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771838672.0535352-3738-176109576668582/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:34 localhost python3.9[211292]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:34 localhost python3.9[211402]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:35 localhost python3.9[211515]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18593 DF PROTO=TCP SPT=56764 DPT=9101 SEQ=2878013049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BEFFB060000000001030307) Feb 23 04:24:36 localhost python3.9[211625]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:36 localhost 
python3.9[211736]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:24:37 localhost python3.9[211848]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:24:39 localhost python3.9[211961]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52456 DF PROTO=TCP SPT=50762 DPT=9882 SEQ=3167604538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0075B0000000001030307) Feb 23 04:24:39 localhost python3.9[212071]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:40 localhost python3.9[212159]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838679.3368304-3954-98544778554840/.source.target follow=False _original_basename=edpm_libvirt.target 
checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:41 localhost python3.9[212269]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:42 localhost python3.9[212357]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838681.0290236-3999-281213604590977/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52458 DF PROTO=TCP SPT=50762 DPT=9882 SEQ=3167604538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF013470000000001030307) Feb 23 04:24:42 localhost python3.9[212467]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:24:43 localhost python3.9[212555]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838682.2423844-4044-43166436884493/.source.target follow=False 
_original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:24:44 localhost python3.9[212665]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:24:44 localhost systemd[1]: Reloading. Feb 23 04:24:44 localhost systemd-rc-local-generator[212694]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:24:44 localhost systemd-sysv-generator[212697]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:44 localhost systemd[1]: Reached target edpm_libvirt.target. Feb 23 04:24:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8888 DF PROTO=TCP SPT=58522 DPT=9105 SEQ=2047497438 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF01E060000000001030307) Feb 23 04:24:45 localhost python3.9[212816]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Feb 23 04:24:45 localhost systemd[1]: Reloading. Feb 23 04:24:45 localhost systemd-sysv-generator[212845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:24:45 localhost systemd-rc-local-generator[212837]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: Reloading. Feb 23 04:24:45 localhost systemd-rc-local-generator[212875]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:24:45 localhost systemd-sysv-generator[212880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:24:46 localhost systemd[1]: session-52.scope: Deactivated successfully. Feb 23 04:24:46 localhost systemd[1]: session-52.scope: Consumed 3min 28.951s CPU time. Feb 23 04:24:46 localhost systemd-logind[759]: Session 52 logged out. Waiting for processes to exit. Feb 23 04:24:46 localhost systemd-logind[759]: Removed session 52. Feb 23 04:24:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:24:46 localhost podman[212906]: 2026-02-23 09:24:46.923613021 +0000 UTC m=+0.093857648 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:24:46 localhost podman[212906]: 2026-02-23 09:24:46.968193387 +0000 UTC m=+0.138438034 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:24:46 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:24:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:24:48.524 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:24:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:24:48.525 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:24:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:24:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:24:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18595 DF PROTO=TCP SPT=56764 DPT=9101 SEQ=2878013049 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF02C070000000001030307) Feb 23 04:24:48 localhost sshd[212929]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:24:52 localhost sshd[212931]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:24:52 localhost systemd-logind[759]: New session 53 of user zuul. Feb 23 04:24:52 localhost systemd[1]: Started Session 53 of User zuul. Feb 23 04:24:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:24:53 localhost python3.9[213042]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:24:53 localhost podman[213043]: 2026-02-23 09:24:53.91850168 +0000 UTC m=+0.084801320 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:24:53 localhost podman[213043]: 2026-02-23 09:24:53.958336823 +0000 UTC m=+0.124636433 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 04:24:53 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:24:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46915 DF PROTO=TCP SPT=41742 DPT=9102 SEQ=1607567423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF040D00000000001030307) Feb 23 04:24:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52460 DF PROTO=TCP SPT=50762 DPT=9882 SEQ=3167604538 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF044070000000001030307) Feb 23 04:24:55 localhost python3.9[213172]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:24:55 localhost network[213189]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:24:55 localhost network[213190]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:24:55 localhost network[213191]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 04:24:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:24:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46917 DF PROTO=TCP SPT=41742 DPT=9102 SEQ=1607567423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF04CC60000000001030307) Feb 23 04:25:00 localhost python3.9[213423]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:25:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41264 DF PROTO=TCP SPT=35660 DPT=9100 SEQ=2940103577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF05B460000000001030307) Feb 23 04:25:01 localhost python3.9[213486]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:25:02 localhost sshd[213489]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:25:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41265 DF PROTO=TCP SPT=35660 DPT=9100 SEQ=2940103577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF063460000000001030307) Feb 23 04:25:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11836 DF PROTO=TCP SPT=37330 DPT=9101 SEQ=2381435964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF070460000000001030307)
Feb 23 04:25:07 localhost sshd[213491]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:25:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46919 DF PROTO=TCP SPT=41742 DPT=9102 SEQ=1607567423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF07C060000000001030307)
Feb 23 04:25:09 localhost python3.9[213602]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:25:10 localhost python3.9[213714]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:11 localhost python3.9[213824]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:25:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29761 DF PROTO=TCP SPT=35032 DPT=9882 SEQ=2357979186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF088860000000001030307)
Feb 23 04:25:12 localhost python3.9[213935]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:25:13 localhost python3.9[214046]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:25:14 localhost python3.9[214157]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:25:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41267 DF PROTO=TCP SPT=35660 DPT=9100 SEQ=2940103577 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF094060000000001030307)
Feb 23 04:25:15 localhost python3.9[214269]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:17 localhost python3.9[214379]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 04:25:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:25:17 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 23 04:25:17 localhost podman[214381]: 2026-02-23 09:25:17.306717199 +0000 UTC m=+0.079761052 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 23 04:25:17 localhost podman[214381]: 2026-02-23 09:25:17.34423734 +0000 UTC m=+0.117281173 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Feb 23 04:25:17 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 04:25:18 localhost python3.9[214519]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 04:25:18 localhost systemd[1]: Reloading.
Feb 23 04:25:18 localhost systemd-sysv-generator[214551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:25:18 localhost systemd-rc-local-generator[214546]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11838 DF PROTO=TCP SPT=37330 DPT=9101 SEQ=2381435964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0A0060000000001030307)
Feb 23 04:25:18 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 23 04:25:18 localhost systemd[1]: Starting Open-iSCSI...
Feb 23 04:25:18 localhost iscsid[214560]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Feb 23 04:25:18 localhost iscsid[214560]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.[:identifier].
Feb 23 04:25:18 localhost iscsid[214560]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Feb 23 04:25:18 localhost iscsid[214560]: If using hardware iscsi like qla4xxx this message can be ignored.
Feb 23 04:25:18 localhost iscsid[214560]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Feb 23 04:25:18 localhost iscsid[214560]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Feb 23 04:25:18 localhost iscsid[214560]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Feb 23 04:25:18 localhost systemd[1]: Started Open-iSCSI.
Feb 23 04:25:18 localhost systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 23 04:25:18 localhost systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 23 04:25:19 localhost python3.9[214669]: ansible-ansible.builtin.service_facts Invoked
Feb 23 04:25:19 localhost network[214686]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 23 04:25:19 localhost network[214687]: 'network-scripts' will be removed from distribution in near future.
Feb 23 04:25:19 localhost network[214688]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 23 04:25:20 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 23 04:25:20 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 23 04:25:20 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Feb 23 04:25:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l a9a31abe-c7d7-40fe-a004-19f43a8d4e45
Feb 23 04:25:21 localhost setroubleshoot[214702]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.#012#012***** Plugin catchall (100. confidence) suggests **************************#012#012If you believe that iscsid should be allowed search access on the iscsi directory by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid#012# semodule -X 300 -i my-iscsid.pp#012
Feb 23 04:25:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:25:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24904 DF PROTO=TCP SPT=50328 DPT=9102 SEQ=3199228888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0B6000000000001030307)
Feb 23 04:25:24 localhost systemd[1]: tmp-crun.h7q49F.mount: Deactivated successfully.
Feb 23 04:25:24 localhost podman[214881]: 2026-02-23 09:25:24.126925504 +0000 UTC m=+0.105861044 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 04:25:24 localhost podman[214881]: 2026-02-23 09:25:24.137309851 +0000 UTC m=+0.116245351 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 23 04:25:24 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:25:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29763 DF PROTO=TCP SPT=35032 DPT=9882 SEQ=2357979186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0B8070000000001030307)
Feb 23 04:25:25 localhost python3.9[215023]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 23 04:25:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24906 DF PROTO=TCP SPT=50328 DPT=9102 SEQ=3199228888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0C2060000000001030307)
Feb 23 04:25:29 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 04:25:29 localhost sshd[215058]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:25:29 localhost systemd[1]: Starting man-db-cache-update.service...
Feb 23 04:25:29 localhost systemd[1]: Reloading.
Feb 23 04:25:29 localhost systemd-rc-local-generator[215084]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:25:29 localhost systemd-sysv-generator[215089]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:25:29 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Feb 23 04:25:29 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 23 04:25:29 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 23 04:25:29 localhost systemd[1]: Finished man-db-cache-update.service.
Feb 23 04:25:29 localhost systemd[1]: run-r637ba52ec0f147179484851958911c07.service: Deactivated successfully.
Feb 23 04:25:29 localhost systemd[1]: run-r4d04a739c1c1418bb00674c5d773d907.service: Deactivated successfully.
Feb 23 04:25:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27134 DF PROTO=TCP SPT=41448 DPT=9100 SEQ=1097254809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0D0860000000001030307)
Feb 23 04:25:31 localhost python3.9[215333]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 23 04:25:31 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Feb 23 04:25:31 localhost systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 23 04:25:32 localhost python3.9[215443]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 23 04:25:32 localhost python3.9[215557]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:25:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27135 DF PROTO=TCP SPT=41448 DPT=9100 SEQ=1097254809 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0D8860000000001030307)
Feb 23 04:25:33 localhost python3.9[215645]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838732.435389-483-233868046033386/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:34 localhost python3.9[215755]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:35 localhost python3.9[215865]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 04:25:35 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 23 04:25:35 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 23 04:25:35 localhost systemd[1]: Stopping Load Kernel Modules...
Feb 23 04:25:35 localhost systemd[1]: Starting Load Kernel Modules...
Feb 23 04:25:35 localhost systemd-modules-load[215869]: Module 'msr' is built in
Feb 23 04:25:35 localhost systemd[1]: Finished Load Kernel Modules.
Feb 23 04:25:36 localhost python3.9[215979]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:25:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36623 DF PROTO=TCP SPT=52888 DPT=9101 SEQ=1907568863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0E5460000000001030307)
Feb 23 04:25:36 localhost python3.9[216090]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:25:37 localhost python3.9[216200]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:25:38 localhost python3.9[216288]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838737.1081872-636-103485383052129/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62953 DF PROTO=TCP SPT=58364 DPT=9882 SEQ=3578246340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0F1B30000000001030307)
Feb 23 04:25:39 localhost python3.9[216398]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:25:40 localhost python3.9[216509]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:40 localhost sshd[216527]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:25:41 localhost python3.9[216621]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62955 DF PROTO=TCP SPT=58364 DPT=9882 SEQ=3578246340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF0FDC60000000001030307)
Feb 23 04:25:42 localhost python3.9[216731]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:43 localhost python3.9[216841]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:43 localhost python3.9[216951]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:44 localhost python3.9[217061]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18967 DF PROTO=TCP SPT=46600 DPT=9105 SEQ=3736887681 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF108060000000001030307)
Feb 23 04:25:45 localhost python3.9[217171]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:25:45 localhost python3.9[217281]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:25:46 localhost python3.9[217393]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:25:47 localhost python3.9[217504]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 23 04:25:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:25:47 localhost systemd[1]: Listening on multipathd control socket.
Feb 23 04:25:47 localhost systemd[1]: tmp-crun.eEaAPC.mount: Deactivated successfully.
Feb 23 04:25:47 localhost podman[217506]: 2026-02-23 09:25:47.543099143 +0000 UTC m=+0.080569687 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Feb 23 04:25:47 localhost podman[217506]: 2026-02-23 09:25:47.656253415 +0000 UTC m=+0.193723939 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0) Feb 23 04:25:47 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:25:48 localhost python3.9[217642]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:25:48 localhost systemd[1]: Starting Wait for udev To Complete Device Initialization... Feb 23 04:25:48 localhost udevadm[217647]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in. Feb 23 04:25:48 localhost systemd[1]: Finished Wait for udev To Complete Device Initialization. Feb 23 04:25:48 localhost systemd[1]: Starting Device-Mapper Multipath Device Controller... 
Feb 23 04:25:48 localhost multipathd[217650]: --------start up-------- Feb 23 04:25:48 localhost multipathd[217650]: read /etc/multipath.conf Feb 23 04:25:48 localhost multipathd[217650]: path checkers start up Feb 23 04:25:48 localhost systemd[1]: Started Device-Mapper Multipath Device Controller. Feb 23 04:25:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:25:48.525 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:25:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:25:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:25:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:25:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:25:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36625 DF PROTO=TCP SPT=52888 DPT=9101 SEQ=1907568863 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF116060000000001030307) Feb 23 04:25:49 localhost python3.9[217768]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 23 04:25:50 localhost python3.9[217878]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Feb 23 04:25:50 localhost python3.9[217997]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:25:51 localhost python3.9[218085]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838750.4665916-1026-127961187653858/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:25:53 localhost python3.9[218195]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:25:54 localhost python3.9[218305]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:25:54 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 23 04:25:54 localhost systemd[1]: Stopped Load Kernel Modules. Feb 23 04:25:54 localhost systemd[1]: Stopping Load Kernel Modules... 
Feb 23 04:25:54 localhost systemd[1]: Starting Load Kernel Modules... Feb 23 04:25:54 localhost systemd-modules-load[218309]: Module 'msr' is built in Feb 23 04:25:54 localhost systemd[1]: Finished Load Kernel Modules. Feb 23 04:25:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4965 DF PROTO=TCP SPT=42684 DPT=9102 SEQ=3732269252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF12B6A0000000001030307) Feb 23 04:25:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:25:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62957 DF PROTO=TCP SPT=58364 DPT=9882 SEQ=3578246340 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF12E060000000001030307) Feb 23 04:25:54 localhost podman[218327]: 2026-02-23 09:25:54.916236253 +0000 UTC m=+0.090919733 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:25:54 localhost podman[218327]: 2026-02-23 09:25:54.926272579 +0000 UTC m=+0.100956069 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:25:54 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. 
Feb 23 04:25:55 localhost python3.9[218436]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:25:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4967 DF PROTO=TCP SPT=42684 DPT=9102 SEQ=3732269252 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF137860000000001030307) Feb 23 04:25:59 localhost systemd[1]: Reloading. Feb 23 04:25:59 localhost systemd-rc-local-generator[218469]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:25:59 localhost systemd-sysv-generator[218475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: Reloading. Feb 23 04:25:59 localhost systemd-rc-local-generator[218505]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:25:59 localhost systemd-sysv-generator[218508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:25:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button) Feb 23 04:26:00 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard) Feb 23 04:26:00 localhost lvm[218557]: PV /dev/loop3 online, VG ceph_vg0 is complete. Feb 23 04:26:00 localhost lvm[218557]: VG ceph_vg0 finished Feb 23 04:26:00 localhost lvm[218558]: PV /dev/loop4 online, VG ceph_vg1 is complete. Feb 23 04:26:00 localhost lvm[218558]: VG ceph_vg1 finished Feb 23 04:26:00 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Feb 23 04:26:00 localhost systemd[1]: Starting man-db-cache-update.service... Feb 23 04:26:00 localhost systemd[1]: Reloading. Feb 23 04:26:00 localhost systemd-rc-local-generator[218610]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:26:00 localhost systemd-sysv-generator[218613]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:00 localhost systemd[1]: Queuing reload/restart jobs for marked units… Feb 23 04:26:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60195 DF PROTO=TCP SPT=50476 DPT=9100 SEQ=4137352120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF145C60000000001030307) Feb 23 04:26:01 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Feb 23 04:26:01 localhost systemd[1]: Finished man-db-cache-update.service. Feb 23 04:26:01 localhost systemd[1]: man-db-cache-update.service: Consumed 1.320s CPU time. Feb 23 04:26:01 localhost systemd[1]: run-rf9480d67471c41c2ad5bf3fba9700aef.service: Deactivated successfully. 
Feb 23 04:26:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60196 DF PROTO=TCP SPT=50476 DPT=9100 SEQ=4137352120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF14DC60000000001030307) Feb 23 04:26:03 localhost python3.9[219868]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:26:03 localhost systemd[1]: Stopping Device-Mapper Multipath Device Controller... Feb 23 04:26:03 localhost multipathd[217650]: exit (signal) Feb 23 04:26:03 localhost multipathd[217650]: --------shut down------- Feb 23 04:26:03 localhost systemd[1]: multipathd.service: Deactivated successfully. Feb 23 04:26:03 localhost systemd[1]: Stopped Device-Mapper Multipath Device Controller. Feb 23 04:26:03 localhost systemd[1]: Starting Device-Mapper Multipath Device Controller... Feb 23 04:26:03 localhost multipathd[219874]: --------start up-------- Feb 23 04:26:03 localhost multipathd[219874]: read /etc/multipath.conf Feb 23 04:26:03 localhost multipathd[219874]: path checkers start up Feb 23 04:26:03 localhost systemd[1]: Started Device-Mapper Multipath Device Controller. 
Feb 23 04:26:04 localhost python3.9[219990]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:26:05 localhost python3.9[220104]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7311 DF PROTO=TCP SPT=57178 DPT=9101 SEQ=1163680637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF15A860000000001030307) Feb 23 04:26:06 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. Feb 23 04:26:07 localhost python3.9[220215]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:26:07 localhost systemd[1]: Reloading. Feb 23 04:26:07 localhost systemd-rc-local-generator[220240]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:26:07 localhost systemd-sysv-generator[220244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:07 localhost systemd[1]: virtproxyd.service: Deactivated successfully. Feb 23 04:26:08 localhost python3.9[220360]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:26:08 localhost network[220377]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:26:08 localhost network[220378]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:26:08 localhost network[220379]: It is advised to switch to 'NetworkManager' instead for network management. 
Feb 23 04:26:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60317 DF PROTO=TCP SPT=43352 DPT=9882 SEQ=795698491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF166E30000000001030307) Feb 23 04:26:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:26:11 localhost sshd[220496]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:26:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60319 DF PROTO=TCP SPT=43352 DPT=9882 SEQ=795698491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF173060000000001030307) Feb 23 04:26:13 localhost python3.9[220614]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:13 localhost python3.9[220725]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:14 localhost python3.9[220836]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65463 DF PROTO=TCP SPT=38604 DPT=9105 SEQ=108143549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BF17E060000000001030307) Feb 23 04:26:15 localhost python3.9[220947]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:16 localhost python3.9[221058]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:17 localhost python3.9[221169]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:17 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Feb 23 04:26:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:26:17 localhost systemd[1]: tmp-crun.87wvyv.mount: Deactivated successfully. 
Feb 23 04:26:17 localhost podman[221282]: 2026-02-23 09:26:17.927948752 +0000 UTC m=+0.094215601 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:26:17 localhost python3.9[221281]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:18 localhost podman[221282]: 2026-02-23 09:26:18.006455876 +0000 UTC m=+0.172722745 container exec_died 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:26:18 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:26:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7313 DF PROTO=TCP SPT=57178 DPT=9101 SEQ=1163680637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF18A060000000001030307) Feb 23 04:26:19 localhost python3.9[221420]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:26:21 localhost sshd[221439]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:26:22 localhost python3.9[221533]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:23 localhost python3.9[221643]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:23 localhost python3.9[221753]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None 
access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=713 DF PROTO=TCP SPT=49604 DPT=9102 SEQ=1882404235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1A0600000000001030307) Feb 23 04:26:24 localhost python3.9[221863]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:24 localhost python3.9[221973]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60321 DF PROTO=TCP SPT=43352 DPT=9882 SEQ=795698491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1A4070000000001030307) Feb 23 04:26:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:26:25 localhost podman[222083]: 2026-02-23 09:26:25.444274982 +0000 UTC m=+0.089731419 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:26:25 localhost 
podman[222083]: 2026-02-23 09:26:25.48119374 +0000 UTC m=+0.126650177 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 23 04:26:25 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:26:25 localhost python3.9[222089]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:26 localhost python3.9[222260]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:27 localhost python3.9[222405]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=715 DF PROTO=TCP SPT=49604 DPT=9102 SEQ=1882404235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1AC860000000001030307) Feb 23 04:26:27 localhost python3.9[222515]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:28 localhost python3.9[222625]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:29 localhost python3.9[222735]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:30 localhost python3.9[222845]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=14552 DF PROTO=TCP SPT=37658 DPT=9100 SEQ=446064436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1BAC60000000001030307) Feb 23 04:26:31 localhost python3.9[222955]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:31 localhost python3.9[223065]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:32 localhost python3.9[223175]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14553 DF PROTO=TCP SPT=37658 DPT=9100 SEQ=446064436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1C2C60000000001030307) Feb 23 04:26:33 localhost python3.9[223285]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:26:33 localhost python3.9[223395]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:34 localhost python3.9[223505]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 23 04:26:35 localhost python3.9[223615]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:26:35 localhost systemd[1]: Reloading. Feb 23 04:26:35 localhost systemd-rc-local-generator[223641]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:26:35 localhost systemd-sysv-generator[223646]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:26:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64713 DF PROTO=TCP SPT=51780 DPT=9101 SEQ=1252174237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1CFC60000000001030307) Feb 23 04:26:36 localhost python3.9[223761]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 
04:26:37 localhost python3.9[223872]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=717 DF PROTO=TCP SPT=49604 DPT=9102 SEQ=1882404235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1DC060000000001030307) Feb 23 04:26:39 localhost python3.9[223983]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:40 localhost python3.9[224094]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:41 localhost python3.9[224205]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:41 localhost python3.9[224316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None 
stdin=None Feb 23 04:26:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7105 DF PROTO=TCP SPT=52182 DPT=9882 SEQ=3071906528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1E8070000000001030307) Feb 23 04:26:42 localhost python3.9[224427]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:43 localhost python3.9[224538]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:26:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14555 DF PROTO=TCP SPT=37658 DPT=9100 SEQ=446064436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF1F2060000000001030307) Feb 23 04:26:46 localhost python3.9[224649]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:47 localhost python3.9[224759]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t 
state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:26:48 localhost systemd[1]: tmp-crun.Xe47G1.mount: Deactivated successfully. Feb 23 04:26:48 localhost podman[224869]: 2026-02-23 09:26:48.230532726 +0000 UTC m=+0.106095976 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible) Feb 23 04:26:48 localhost podman[224869]: 2026-02-23 09:26:48.30733741 +0000 UTC m=+0.182900640 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0) Feb 23 04:26:48 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:26:48 localhost python3.9[224870]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:26:48.526 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:26:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:26:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:26:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:26:48.529 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:26:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64715 DF PROTO=TCP SPT=51780 DPT=9101 SEQ=1252174237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF200060000000001030307) Feb 23 04:26:48 localhost python3.9[225003]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:49 localhost python3.9[225113]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:50 localhost python3.9[225223]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:51 localhost python3.9[225333]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:52 localhost sshd[225389]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:26:52 localhost python3.9[225445]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:53 localhost python3.9[225555]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:26:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57105 DF PROTO=TCP SPT=44898 DPT=9102 SEQ=3126667153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF215900000000001030307) Feb 23 04:26:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7107 DF PROTO=TCP SPT=52182 DPT=9882 SEQ=3071906528 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF218060000000001030307) Feb 23 04:26:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:26:55 localhost podman[225573]: 2026-02-23 09:26:55.913916223 +0000 UTC m=+0.083457423 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent) Feb 23 04:26:55 localhost 
podman[225573]: 2026-02-23 09:26:55.943072989 +0000 UTC m=+0.112614179 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:26:55 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:26:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57107 DF PROTO=TCP SPT=44898 DPT=9102 SEQ=3126667153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF221870000000001030307) Feb 23 04:26:59 localhost sshd[225590]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:27:00 localhost python3.9[225684]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Feb 23 04:27:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43285 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=2100312813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF230060000000001030307) Feb 23 04:27:02 localhost python3.9[225795]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Feb 23 04:27:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43286 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=2100312813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF238060000000001030307) Feb 23 04:27:03 localhost python3.9[225911]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005626463.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER 
login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Feb 23 04:27:04 localhost sshd[225937]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:27:04 localhost systemd-logind[759]: New session 54 of user zuul. Feb 23 04:27:04 localhost systemd[1]: Started Session 54 of User zuul. Feb 23 04:27:04 localhost systemd[1]: session-54.scope: Deactivated successfully. Feb 23 04:27:04 localhost systemd-logind[759]: Session 54 logged out. Waiting for processes to exit. Feb 23 04:27:04 localhost systemd-logind[759]: Removed session 54. Feb 23 04:27:05 localhost python3.9[226048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:05 localhost python3.9[226103]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14873 DF PROTO=TCP SPT=55968 DPT=9101 SEQ=144703007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF245060000000001030307) Feb 23 04:27:06 localhost 
python3.9[226211]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:07 localhost python3.9[226297]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838826.1049607-2628-140247797123933/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:07 localhost python3.9[226405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:08 localhost python3.9[226491]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838827.1898627-2628-180094893805398/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:08 localhost python3.9[226599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:09 localhost python3.9[226685]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838828.2673264-2628-30575015824456/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31258 DF PROTO=TCP SPT=40670 DPT=9882 SEQ=3324353408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF251420000000001030307) Feb 23 04:27:09 localhost python3.9[226793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:10 localhost python3.9[226879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771838829.352772-2790-3128833922551/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=3012482a375a6db0cadffa2656b647c3720d54e9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:10 localhost python3.9[226989]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:11 localhost python3.9[227099]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:12 localhost python3.9[227209]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:27:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31260 DF PROTO=TCP SPT=40670 DPT=9882 SEQ=3324353408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF25D460000000001030307) Feb 23 04:27:12 localhost python3.9[227321]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:14 localhost python3.9[227429]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:27:15 localhost python3.9[227541]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config 
recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43288 DF PROTO=TCP SPT=33892 DPT=9100 SEQ=2100312813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF268070000000001030307) Feb 23 04:27:16 localhost python3.9[227651]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:17 localhost python3.9[227759]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14875 DF PROTO=TCP SPT=55968 DPT=9101 SEQ=144703007 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF276060000000001030307) Feb 23 04:27:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:27:18 localhost podman[227971]: 2026-02-23 09:27:18.916146061 +0000 UTC m=+0.089663169 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible) Feb 23 04:27:18 localhost podman[227971]: 2026-02-23 09:27:18.994266624 +0000 UTC m=+0.167783732 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 23 04:27:19 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:27:19 localhost python3.9[228087]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False Feb 23 04:27:20 localhost python3.9[228197]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:27:21 localhost python3[228307]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:27:21 localhost podman[228344]: Feb 23 04:27:21 localhost podman[228344]: 2026-02-23 09:27:21.948486131 +0000 UTC m=+0.088493823 container create 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, 
tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 04:27:21 localhost podman[228344]: 2026-02-23 09:27:21.903557049 +0000 UTC m=+0.043564761 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 23 04:27:21 localhost python3[228307]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- 
eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Feb 23 04:27:22 localhost python3.9[228489]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:27:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9776 DF PROTO=TCP SPT=37540 DPT=9102 SEQ=1072854065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF28AC00000000001030307) Feb 23 04:27:24 localhost python3.9[228599]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:27:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31262 DF PROTO=TCP SPT=40670 DPT=9882 SEQ=3324353408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF28E070000000001030307) Feb 23 04:27:25 localhost python3.9[228709]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:25 localhost python3.9[228799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838844.8507292-3261-26667132994929/.source.yaml _original_basename=.nmikqgrf follow=False checksum=dde8f4b0d63c380bd7f7596e7df827a8064c101b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:26 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:27:26 localhost podman[228910]: 2026-02-23 09:27:26.764048404 +0000 UTC m=+0.077347370 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Feb 23 04:27:26 localhost podman[228910]: 2026-02-23 09:27:26.795241287 +0000 UTC m=+0.108540233 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:27:26 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:27:26 localhost python3.9[228909]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9778 DF PROTO=TCP SPT=37540 DPT=9102 SEQ=1072854065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF296C70000000001030307) Feb 23 04:27:27 localhost python3.9[229071]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:27:28 localhost python3.9[229213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:28 localhost python3.9[229321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838847.7595446-3360-270404393237033/.source.json _original_basename=.qlvxpjcy 
follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:29 localhost python3.9[229429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18776 DF PROTO=TCP SPT=39096 DPT=9100 SEQ=2040014006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2A5460000000001030307) Feb 23 04:27:32 localhost python3.9[229733]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False Feb 23 04:27:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18777 DF PROTO=TCP SPT=39096 DPT=9100 SEQ=2040014006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2AD470000000001030307) Feb 23 04:27:33 localhost sshd[229751]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:27:33 localhost python3.9[229845]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:27:34 localhost sshd[229956]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:27:34 localhost 
python3[229955]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:27:35 localhost python3[229955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3",#012 "Digest": "sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-02-23T06:27:42.035349623Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1216089983,#012 "VirtualSize": 1216089983,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",#012 "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",#012 "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",#012 "sha256:5511acb0625eca242fd47549a8bafd7826358a029c48a9158ddd6fa2b7e0b86d",#012 "sha256:1f1e90f8b2058c74071fe0298f6d20f4d1edbde3bdd940d26fcd35c036f677a8"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-02-17T01:25:07.246646992Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:07.246739119Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260216\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:12.132997501Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-02-23T06:08:39.081651802Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081666472Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081677733Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081688343Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081701553Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081710413Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.413481757Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:09:13.490649497Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 23 04:27:35 localhost podman[230005]: 2026-02-23 09:27:35.204022524 +0000 UTC m=+0.096243385 container remove c3f899c2b5ce4f4d7b5858de78d59efc3a8874f152c256dfb1545b9533058442 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '45772c82d00b8348e0440509154d74a9-b5f04eda8e5f004a5ff6ec948b25cc1e'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-type=git, container_name=nova_compute, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, version=17.1.13) Feb 23 04:27:35 localhost python3[229955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Feb 23 04:27:35 localhost podman[230019]: Feb 23 04:27:35 localhost podman[230019]: 2026-02-23 09:27:35.315530165 
+0000 UTC m=+0.090207595 container create 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.43.0, config_id=nova_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:27:35 localhost podman[230019]: 2026-02-23 09:27:35.274637763 +0000 UTC m=+0.049315243 image pull 
quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 23 04:27:35 localhost python3[229955]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume 
/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Feb 23 04:27:36 localhost python3.9[230166]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:27:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29997 DF PROTO=TCP SPT=51930 DPT=9101 SEQ=2185825708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2BA070000000001030307) Feb 23 04:27:36 localhost python3.9[230278]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:37 localhost python3.9[230333]: 
ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:27:37 localhost python3.9[230442]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771838857.405786-3594-3872314354872/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:27:38 localhost python3.9[230497]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:27:38 localhost systemd[1]: Reloading. Feb 23 04:27:38 localhost systemd-rc-local-generator[230523]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:27:38 localhost systemd-sysv-generator[230527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9780 DF PROTO=TCP SPT=37540 DPT=9102 SEQ=1072854065 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2C6070000000001030307) Feb 23 04:27:39 localhost python3.9[230589]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:27:40 localhost systemd[1]: Reloading. 
Feb 23 04:27:40 localhost systemd-sysv-generator[230621]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:27:40 localhost systemd-rc-local-generator[230616]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:27:40 localhost systemd[1]: Starting nova_compute container... Feb 23 04:27:41 localhost systemd[1]: Started libcrun container. 
Feb 23 04:27:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:41 localhost podman[230630]: 2026-02-23 09:27:41.073056103 +0000 UTC m=+0.133058675 container init 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': 
['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, config_id=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:27:41 localhost podman[230630]: 2026-02-23 09:27:41.083109403 +0000 UTC m=+0.143111975 container start 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:27:41 localhost podman[230630]: nova_compute Feb 23 04:27:41 localhost nova_compute[230643]: + sudo -E kolla_set_configs Feb 23 04:27:41 localhost systemd[1]: Started nova_compute container. 
Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Validating config file Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying service configuration files Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:27:41 localhost nova_compute[230643]: 
INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Deleting /etc/ceph Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Creating directory /etc/ceph Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 23 04:27:41 localhost nova_compute[230643]: 
INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Writing out command to execute Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:41 localhost nova_compute[230643]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:27:41 localhost nova_compute[230643]: ++ cat /run_command Feb 23 04:27:41 localhost nova_compute[230643]: + CMD=nova-compute Feb 23 04:27:41 localhost nova_compute[230643]: + ARGS= Feb 23 04:27:41 localhost nova_compute[230643]: + sudo kolla_copy_cacerts Feb 23 04:27:41 localhost nova_compute[230643]: + [[ ! -n '' ]] Feb 23 04:27:41 localhost nova_compute[230643]: + . 
kolla_extend_start Feb 23 04:27:41 localhost nova_compute[230643]: + echo 'Running command: '\''nova-compute'\''' Feb 23 04:27:41 localhost nova_compute[230643]: Running command: 'nova-compute' Feb 23 04:27:41 localhost nova_compute[230643]: + umask 0022 Feb 23 04:27:41 localhost nova_compute[230643]: + exec nova-compute Feb 23 04:27:41 localhost python3.9[230762]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:27:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29410 DF PROTO=TCP SPT=39054 DPT=9882 SEQ=3387491264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2D2860000000001030307) Feb 23 04:27:42 localhost nova_compute[230643]: 2026-02-23 09:27:42.875 230647 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:27:42 localhost nova_compute[230643]: 2026-02-23 09:27:42.876 230647 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:27:42 localhost nova_compute[230643]: 2026-02-23 09:27:42.876 230647 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:27:42 localhost nova_compute[230643]: 2026-02-23 09:27:42.876 230647 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.001 230647 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.024 230647 DEBUG oslo_concurrency.processutils [-] CMD "grep -F 
node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.025 230647 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.514 230647 INFO nova.virt.driver [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.629 230647 INFO nova.compute.provider_config [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Feb 23 04:27:43 localhost python3.9[230877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.644 230647 WARNING nova.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquiring lock "singleton_lock" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.645 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config files: 
['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.646 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.647 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] config_source = [] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console_host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.648 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.649 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.650 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 
- - - - - -] host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.651 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.652 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.653 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_rotate_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.654 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.655 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_logfile_count = 1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.656 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.657 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.658 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service 
[None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.659 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] record = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.660 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 
localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.661 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 
2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.662 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.663 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.665 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.665 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.666 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.666 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.666 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] tempdir = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.667 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.667 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.667 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.668 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.668 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.668 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.669 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.670 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.670 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.670 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.671 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.672 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.672 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.672 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.673 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.673 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.673 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.674 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.674 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.674 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 
localhost nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.675 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.676 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.676 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.676 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.677 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.678 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.678 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.678 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.679 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.679 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.679 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.680 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.681 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.681 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 
2026-02-23 09:27:43.681 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.682 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.682 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.682 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.683 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.683 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.683 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.684 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.685 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.685 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.685 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.686 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.687 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.687 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.687 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - 
-] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.688 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.689 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.689 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.689 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.690 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.690 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.690 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.691 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.691 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.691 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.692 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.692 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.692 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.693 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.694 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.694 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.694 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.695 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.696 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.696 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.696 230647 DEBUG oslo_service.service 
[None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.697 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.697 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.697 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.698 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.698 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.698 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.699 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.699 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.699 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.700 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.701 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.701 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.702 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.min_version = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.703 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.704 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.705 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_recycle_time = 3600 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.706 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.max_pool_size = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.707 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.slave_connection = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.708 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.709 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.connection_trace = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.710 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.max_retries = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.711 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] api_database.sqlite_synchronous = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.712 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.713 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.714 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.enable_rbd_download = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.715 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.716 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.717 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.718 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.719 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.720 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.721 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.722 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.precache_concurrency 
= 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.723 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.724 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.725 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.726 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.727 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.728 230647 
DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.728 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.729 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.730 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.731 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.732 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.733 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.734 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.735 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.736 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service 
[None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.737 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.738 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] keystone.version = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.739 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_management = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.740 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.741 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_glance_store_name = 
default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.742 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.iscsi_iface = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_inbound_addr = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.743 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.744 230647 WARNING oslo_config.cfg [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 23 04:27:43 localhost nova_compute[230643]: live_migration_uri is deprecated for removal in favor of two other options that Feb 23 04:27:43 localhost nova_compute[230643]: allow to change live migration 
scheme and target URI: ``live_migration_scheme`` Feb 23 04:27:43 localhost nova_compute[230643]: and ``live_migration_inbound_addr`` respectively. Feb 23 04:27:43 localhost nova_compute[230643]: ). Its value may be silently ignored in the future.#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.744 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.745 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.746 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_secret_uuid = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rbd_user = 
openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.747 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.748 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.swtpm_enabled = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.749 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.750 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.751 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.752 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.753 
230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.754 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.755 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.756 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.757 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.auth_url = 
http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.default_domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.758 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.min_version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.759 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.service_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.760 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.trust_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.761 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.762 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 
2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.763 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.764 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 
09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.765 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.766 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.767 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.768 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.769 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 
2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.770 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.771 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 
230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.772 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service 
[None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.773 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.774 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.775 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.776 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.777 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.778 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 
2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.779 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 
09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.780 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.781 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 
localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.782 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.783 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.784 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.785 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.786 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.787 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.788 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.789 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.790 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost 
nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.791 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.792 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.793 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.794 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.795 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.796 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.797 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.798 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.799 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None 
req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.800 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.801 
230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.801 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.802 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.803 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG 
oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.804 230647 DEBUG oslo_service.service [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.805 230647 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.836 230647 INFO nova.virt.node [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.837 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.838 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.838 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.838 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.848 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.850 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.851 230647 INFO nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Connection event '1' reason 'None'#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.864 230647 DEBUG nova.virt.libvirt.volume.mount [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.873 230647 INFO nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host capabilities
[capabilities XML garbled in log extraction: element markup lost, one syslog prefix interleaved per original XML line. Recoverable values: host UUID bdcaa433-cfc7-450a-99ab-f0985ab59447; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp, rdma; memory 16116612 KiB (4029153 pages); security models selinux (doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, +107:+107); hvm guests for 32-bit and 64-bit via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (alias q35)]#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.878 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.895 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[domain capabilities XML garbled in log extraction; recoverable values: path /usr/libexec/qemu-kvm, domain kvm, machine pc-q35-rhel9.8.0, arch i686, loader /usr/share/OVMF/OVMF_CODE.secboot.fd, loader type rom; dump truncated here]
pflash Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: yes Feb 23 04:27:43 localhost nova_compute[230643]: no Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: no Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: on Feb 23 04:27:43 localhost nova_compute[230643]: off Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: on Feb 23 04:27:43 localhost nova_compute[230643]: off Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome Feb 23 04:27:43 localhost nova_compute[230643]: AMD Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: 486 Feb 23 04:27:43 localhost nova_compute[230643]: 486-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell-noTSX Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell-noTSX-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Broadwell-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Broadwell-v4 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server-noTSX Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server-v4 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cascadelake-Server-v5 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: ClearwaterForest Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: ClearwaterForest-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Conroe Feb 23 04:27:43 localhost nova_compute[230643]: Conroe-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Cooperlake Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cooperlake-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 
localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Cooperlake-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Denverton Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Denverton-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 
04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Denverton-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Denverton-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Dhyana Feb 23 04:27:43 localhost nova_compute[230643]: Dhyana-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Dhyana-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Genoa Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 
localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Genoa-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Genoa-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 
localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-IBPB Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Milan Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Milan-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 
04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Milan-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Milan-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: 
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome-v1
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome-v2
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome-v3
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome-v4
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Rome-v5
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Turin
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-Turin-v1
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-v1
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-v2
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-v3
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-v4
Feb 23 04:27:43 localhost nova_compute[230643]: EPYC-v5
Feb 23 04:27:43 localhost nova_compute[230643]: GraniteRapids
Feb 23 04:27:43 localhost nova_compute[230643]: GraniteRapids-v1
Feb 23 04:27:43 localhost nova_compute[230643]: GraniteRapids-v2
Feb 23 04:27:43 localhost nova_compute[230643]: GraniteRapids-v3
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-IBRS
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-noTSX
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-noTSX-IBRS
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-v2
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-v3
Feb 23 04:27:43 localhost nova_compute[230643]: Haswell-v4
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-noTSX
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v2
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v3
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v4
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v5
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v6
Feb 23 04:27:43 localhost nova_compute[230643]: Icelake-Server-v7
Feb 23 04:27:43 localhost nova_compute[230643]: IvyBridge
Feb 23 04:27:43 localhost nova_compute[230643]: IvyBridge-IBRS
Feb 23 04:27:43 localhost nova_compute[230643]: IvyBridge-v1
Feb 23 04:27:43 localhost nova_compute[230643]: IvyBridge-v2
Feb 23 04:27:43 localhost nova_compute[230643]: KnightsMill
Feb 23 04:27:43 localhost nova_compute[230643]: KnightsMill-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Nehalem
Feb 23 04:27:43 localhost nova_compute[230643]: Nehalem-IBRS
Feb 23 04:27:43 localhost nova_compute[230643]: Nehalem-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Nehalem-v2
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G1
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G1-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G2
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G2-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G3
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G3-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G4
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G4-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G5
Feb 23 04:27:43 localhost nova_compute[230643]: Opteron_G5-v1
Feb 23 04:27:43 localhost nova_compute[230643]: Penryn
Feb 23 04:27:43 localhost nova_compute[230643]: Penryn-v1
Feb 23 04:27:43 localhost nova_compute[230643]: SandyBridge
Feb 23 04:27:43 localhost nova_compute[230643]: SandyBridge-IBRS
Feb 23 04:27:43 localhost nova_compute[230643]: SandyBridge-v1
Feb 23 04:27:43 localhost nova_compute[230643]: SandyBridge-v2
Feb 23 04:27:43 localhost nova_compute[230643]: SapphireRapids
Feb 23 04:27:43 localhost nova_compute[230643]: SapphireRapids-v1
Feb 23 04:27:43 localhost nova_compute[230643]: SapphireRapids-v2
Feb 23 04:27:43 localhost nova_compute[230643]: SapphireRapids-v3
Feb 23 04:27:43 localhost nova_compute[230643]: SapphireRapids-v4
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: SierraForest Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: SierraForest-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: SierraForest-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: SierraForest-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client-noTSX-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 
23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Client-v4 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-noTSX-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 
23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-v4 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Skylake-Server-v5 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Snowridge Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 
04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Snowridge-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Snowridge-v2 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Snowridge-v3 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Snowridge-v4 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Westmere Feb 23 04:27:43 localhost nova_compute[230643]: Westmere-IBRS Feb 23 04:27:43 localhost nova_compute[230643]: Westmere-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Westmere-v2 Feb 23 04:27:43 localhost nova_compute[230643]: athlon Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: athlon-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: core2duo Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: core2duo-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: coreduo Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: coreduo-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: kvm32 Feb 23 04:27:43 localhost nova_compute[230643]: kvm32-v1 Feb 23 04:27:43 localhost nova_compute[230643]: kvm64 Feb 23 04:27:43 localhost nova_compute[230643]: kvm64-v1 Feb 23 04:27:43 localhost nova_compute[230643]: n270 Feb 23 
04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: n270-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: pentium Feb 23 04:27:43 localhost nova_compute[230643]: pentium-v1 Feb 23 04:27:43 localhost nova_compute[230643]: pentium2 Feb 23 04:27:43 localhost nova_compute[230643]: pentium2-v1 Feb 23 04:27:43 localhost nova_compute[230643]: pentium3 Feb 23 04:27:43 localhost nova_compute[230643]: pentium3-v1 Feb 23 04:27:43 localhost nova_compute[230643]: phenom Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: phenom-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: qemu32 Feb 23 04:27:43 localhost nova_compute[230643]: qemu32-v1 Feb 23 04:27:43 localhost nova_compute[230643]: qemu64 Feb 23 04:27:43 localhost nova_compute[230643]: qemu64-v1 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: file Feb 23 04:27:43 localhost nova_compute[230643]: anonymous Feb 23 04:27:43 localhost nova_compute[230643]: memfd Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: disk Feb 23 04:27:43 localhost nova_compute[230643]: cdrom Feb 23 04:27:43 localhost nova_compute[230643]: floppy Feb 23 04:27:43 localhost nova_compute[230643]: lun Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: fdc Feb 23 04:27:43 localhost nova_compute[230643]: scsi Feb 23 04:27:43 localhost nova_compute[230643]: virtio Feb 23 04:27:43 localhost nova_compute[230643]: usb Feb 23 04:27:43 localhost nova_compute[230643]: sata Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: virtio Feb 23 04:27:43 localhost nova_compute[230643]: virtio-transitional Feb 23 04:27:43 localhost nova_compute[230643]: virtio-non-transitional Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: vnc Feb 23 04:27:43 localhost nova_compute[230643]: egl-headless Feb 23 04:27:43 localhost nova_compute[230643]: dbus Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: subsystem Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: default Feb 23 04:27:43 localhost nova_compute[230643]: mandatory Feb 23 04:27:43 localhost nova_compute[230643]: requisite Feb 23 04:27:43 localhost nova_compute[230643]: optional Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost 
nova_compute[230643]: usb Feb 23 04:27:43 localhost nova_compute[230643]: pci Feb 23 04:27:43 localhost nova_compute[230643]: scsi Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: virtio Feb 23 04:27:43 localhost nova_compute[230643]: virtio-transitional Feb 23 04:27:43 localhost nova_compute[230643]: virtio-non-transitional Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: random Feb 23 04:27:43 localhost nova_compute[230643]: egd Feb 23 04:27:43 localhost nova_compute[230643]: builtin Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: path Feb 23 04:27:43 localhost nova_compute[230643]: handle Feb 23 04:27:43 localhost nova_compute[230643]: virtiofs Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: tpm-tis Feb 23 04:27:43 localhost nova_compute[230643]: tpm-crb Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: emulator Feb 23 04:27:43 localhost nova_compute[230643]: external Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: 2.0 Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 04:27:43 localhost nova_compute[230643]: Feb 23 
04:27:43 localhost nova_compute[230643]: [libvirt capabilities XML continued; markup stripped in log capture. Recoverable values: console/serial back-ends usb, pty, unix, qemu, builtin, default, passt, isa, hyperv; char device types null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; Hyper-V enlightenments relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; value 4095; toggles on/off/off; string "Linux KVM Hv"] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 04:27:43 localhost nova_compute[230643]: 2026-02-23 09:27:43.901 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 04:27:43 localhost nova_compute[230643]: [domainCapabilities XML; markup stripped in log capture. Recoverable values: emulator /usr/libexec/qemu-kvm; domain type kvm; machine pc-i440fx-rhel7.6.0; arch i686; firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom, pflash; yes/no toggles; on/off toggles); host CPU model EPYC-Rome, vendor AMD; known CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, …]
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: GraniteRapids-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-noTSX Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-noTSX-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-noTSX Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: 
Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v5 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: 
Icelake-Server-v6 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v7 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: KnightsMill Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: KnightsMill-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G1-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G2 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G2-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G3 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G3-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G4-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G5 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G5-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Penryn Feb 23 04:27:44 localhost nova_compute[230643]: Penryn-v1 Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-v1 Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-v2 Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: 
Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: [libvirt domain capabilities XML; element markup and blank continuation lines were destroyed by per-line syslog wrapping, so only the element values are recoverable and are listed below]
Feb 23 04:27:44 localhost nova_compute[230643]: CPU models (continued): SapphireRapids-v2 SapphireRapids-v3 SapphireRapids-v4 SierraForest SierraForest-v1 SierraForest-v2 SierraForest-v3 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 23 04:27:44 localhost nova_compute[230643]: memoryBacking sourceType: file anonymous memfd
Feb 23 04:27:44 localhost nova_compute[230643]: disk diskDevice: disk cdrom floppy lun; bus: ide fdc scsi virtio usb sata; model: virtio virtio-transitional virtio-non-transitional
Feb 23 04:27:44 localhost nova_compute[230643]: graphics type: vnc egl-headless dbus
Feb 23 04:27:44 localhost nova_compute[230643]: hostdev mode: subsystem; startupPolicy: default mandatory requisite optional; subsysType: usb pci scsi
Feb 23 04:27:44 localhost nova_compute[230643]: rng model: virtio virtio-transitional virtio-non-transitional; backendModel: random egd builtin
Feb 23 04:27:44 localhost nova_compute[230643]: filesystem driverType: path handle virtiofs
Feb 23 04:27:44 localhost nova_compute[230643]: tpm model: tpm-tis tpm-crb; backendModel: emulator external; backendVersion: 2.0
Feb 23 04:27:44 localhost nova_compute[230643]: redirdev bus: usb; channel type: pty unix
Feb 23 04:27:44 localhost nova_compute[230643]: crypto model: qemu; backendModel: builtin; interface backendType: default passt
Feb 23 04:27:44 localhost nova_compute[230643]: panic model: isa hyperv
Feb 23 04:27:44 localhost nova_compute[230643]: chardev types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 23 04:27:44 localhost nova_compute[230643]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
Feb 23 04:27:44 localhost nova_compute[230643]: 4095 on off off Linux KVM Hv
Feb 23 04:27:44 localhost nova_compute[230643]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:43.962 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:43.968 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 04:27:44 localhost nova_compute[230643]: [domain capabilities XML for machine_type=q35; markup again lost to log wrapping, recoverable values below]
Feb 23 04:27:44 localhost nova_compute[230643]: path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64; os firmware: efi
Feb 23 04:27:44 localhost nova_compute[230643]: loader values: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd /usr/share/edk2/ovmf/OVMF_CODE.fd /usr/share/edk2/ovmf/OVMF.amdsev.fd /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; type: rom pflash; readonly: yes no; secure: yes no
Feb 23 04:27:44 localhost nova_compute[230643]: on off / on off
Feb 23 04:27:44 localhost nova_compute[230643]: host-model CPU: EPYC-Rome, vendor AMD
Feb 23 04:27:44 localhost nova_compute[230643]: CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX
23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Broadwell-noTSX-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Broadwell-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Broadwell-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Broadwell-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Broadwell-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cascadelake-Server Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cascadelake-Server-noTSX Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cascadelake-Server-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Cascadelake-Server-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cascadelake-Server-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cascadelake-Server-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cascadelake-Server-v5 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: ClearwaterForest Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: ClearwaterForest-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Conroe Feb 23 04:27:44 localhost nova_compute[230643]: Conroe-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Cooperlake Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cooperlake-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Cooperlake-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Denverton Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Denverton-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Denverton-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Denverton-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Dhyana Feb 23 04:27:44 localhost nova_compute[230643]: Dhyana-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Dhyana-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Genoa Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 
23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Genoa-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Genoa-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-IBPB Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Milan Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Milan-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Milan-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Milan-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Rome Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Rome-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Rome-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Rome-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Rome-v4 Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Rome-v5 Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Turin Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-Turin-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
Feb 23 04:27:44 localhost nova_compute[230643]: [libvirt CPU model list follows; surrounding XML markup was lost in the log capture]
Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-v1
Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-v2
Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-v3
Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-v4
Feb 23 04:27:44 localhost nova_compute[230643]: EPYC-v5
Feb 23 04:27:44 localhost nova_compute[230643]: GraniteRapids
Feb 23 04:27:44 localhost nova_compute[230643]: GraniteRapids-v1
Feb 23 04:27:44 localhost nova_compute[230643]: GraniteRapids-v2
Feb 23 04:27:44 localhost nova_compute[230643]: GraniteRapids-v3
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-IBRS
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-noTSX
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-noTSX-IBRS
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v2
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v3
Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v4
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-noTSX
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v2
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v3
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v4
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v5
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v6
Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v7
Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge
Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-IBRS
Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-v1
Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-v2
Feb 23 04:27:44 localhost nova_compute[230643]: KnightsMill
Feb 23 04:27:44 localhost nova_compute[230643]: KnightsMill-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem
Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-IBRS
Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-v2
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G1
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G1-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G2
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G2-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G3
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G3-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G4
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G4-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G5
Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G5-v1
Feb 23 04:27:44 localhost nova_compute[230643]: Penryn
Feb 23 04:27:44 localhost nova_compute[230643]: Penryn-v1
Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge
Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-IBRS
Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-v1
Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-v2
Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids
Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v1
Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v2
Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v3
Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v4
Feb 23 04:27:44 localhost nova_compute[230643]: SierraForest
Feb 23 04:27:44 localhost nova_compute[230643]: SierraForest-v1
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: SierraForest-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: SierraForest-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client-noTSX-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Client-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-noTSX-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Skylake-Server-v5 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 
23 04:27:44 localhost nova_compute[230643]: Snowridge Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Snowridge-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Snowridge-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Snowridge-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Snowridge-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Westmere Feb 23 04:27:44 localhost nova_compute[230643]: Westmere-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Westmere-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Westmere-v2 Feb 23 04:27:44 localhost nova_compute[230643]: athlon Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: athlon-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: core2duo Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: core2duo-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: coreduo Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: coreduo-v1 Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: kvm32 Feb 23 04:27:44 localhost nova_compute[230643]: kvm32-v1 Feb 23 04:27:44 localhost nova_compute[230643]: kvm64 Feb 23 04:27:44 localhost nova_compute[230643]: kvm64-v1 Feb 23 04:27:44 localhost nova_compute[230643]: n270 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: n270-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: pentium Feb 23 04:27:44 localhost nova_compute[230643]: pentium-v1 Feb 23 04:27:44 localhost nova_compute[230643]: pentium2 Feb 23 04:27:44 localhost nova_compute[230643]: pentium2-v1 Feb 23 04:27:44 localhost nova_compute[230643]: pentium3 Feb 23 04:27:44 localhost nova_compute[230643]: pentium3-v1 Feb 23 04:27:44 localhost nova_compute[230643]: phenom Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: phenom-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: qemu32 Feb 23 04:27:44 localhost nova_compute[230643]: qemu32-v1 Feb 23 04:27:44 localhost nova_compute[230643]: qemu64 Feb 23 04:27:44 localhost nova_compute[230643]: qemu64-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: file Feb 23 04:27:44 localhost nova_compute[230643]: anonymous Feb 23 04:27:44 localhost nova_compute[230643]: memfd Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: disk Feb 23 04:27:44 localhost nova_compute[230643]: cdrom Feb 23 04:27:44 localhost nova_compute[230643]: floppy Feb 23 04:27:44 localhost nova_compute[230643]: lun Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: fdc Feb 23 04:27:44 localhost nova_compute[230643]: scsi Feb 23 04:27:44 localhost nova_compute[230643]: virtio Feb 23 04:27:44 localhost nova_compute[230643]: usb Feb 23 04:27:44 localhost nova_compute[230643]: sata Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: virtio Feb 23 04:27:44 localhost nova_compute[230643]: virtio-transitional Feb 23 04:27:44 localhost nova_compute[230643]: virtio-non-transitional Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: vnc Feb 23 04:27:44 localhost nova_compute[230643]: egl-headless Feb 23 04:27:44 localhost nova_compute[230643]: dbus Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: subsystem Feb 23 04:27:44 localhost nova_compute[230643]: 
Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: default Feb 23 04:27:44 localhost nova_compute[230643]: mandatory Feb 23 04:27:44 localhost nova_compute[230643]: requisite Feb 23 04:27:44 localhost nova_compute[230643]: optional Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: usb Feb 23 04:27:44 localhost nova_compute[230643]: pci Feb 23 04:27:44 localhost nova_compute[230643]: scsi Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: virtio Feb 23 04:27:44 localhost nova_compute[230643]: virtio-transitional Feb 23 04:27:44 localhost nova_compute[230643]: virtio-non-transitional Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: random Feb 23 04:27:44 localhost nova_compute[230643]: egd Feb 23 04:27:44 localhost nova_compute[230643]: builtin Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: path Feb 23 04:27:44 localhost nova_compute[230643]: handle Feb 23 04:27:44 localhost nova_compute[230643]: virtiofs Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: tpm-tis Feb 23 04:27:44 localhost nova_compute[230643]: tpm-crb Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: emulator Feb 23 04:27:44 localhost nova_compute[230643]: external Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: 2.0 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: usb Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: pty Feb 23 04:27:44 localhost nova_compute[230643]: unix Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: qemu Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: builtin Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: default Feb 23 04:27:44 localhost nova_compute[230643]: passt Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: isa Feb 23 04:27:44 localhost nova_compute[230643]: hyperv Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: null Feb 23 04:27:44 localhost nova_compute[230643]: vc Feb 23 04:27:44 localhost nova_compute[230643]: pty Feb 23 04:27:44 localhost nova_compute[230643]: dev Feb 23 04:27:44 localhost nova_compute[230643]: file Feb 23 04:27:44 localhost nova_compute[230643]: pipe Feb 23 04:27:44 localhost nova_compute[230643]: stdio Feb 23 04:27:44 localhost nova_compute[230643]: udp Feb 23 04:27:44 localhost nova_compute[230643]: tcp Feb 23 04:27:44 localhost nova_compute[230643]: unix Feb 23 04:27:44 localhost nova_compute[230643]: qemu-vdagent Feb 23 04:27:44 localhost nova_compute[230643]: dbus Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: relaxed Feb 23 04:27:44 localhost nova_compute[230643]: vapic Feb 23 04:27:44 localhost nova_compute[230643]: spinlocks Feb 23 04:27:44 localhost nova_compute[230643]: vpindex Feb 23 04:27:44 localhost nova_compute[230643]: runtime Feb 23 04:27:44 localhost nova_compute[230643]: synic Feb 23 04:27:44 localhost nova_compute[230643]: stimer Feb 23 04:27:44 localhost nova_compute[230643]: reset Feb 23 04:27:44 
Feb 23 04:27:44 localhost nova_compute[230643]: [multi-line libvirt <domainCapabilities> XML; element tags were lost in this log capture, leaving only text nodes. Recoverable values: Hyper-V enlightenment features vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; values 4095, on, off, off; vendor_id string "Linux KVM Hv"] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.042 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 04:27:44 localhost nova_compute[230643]: [multi-line <domainCapabilities> XML; tags lost as above. Recoverable values: emulator /usr/libexec/qemu-kvm, domain type kvm, machine pc-i440fx-rhel7.6.0, arch x86_64; firmware /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash (readonly: yes, no; secure: no); assorted on/off feature toggles; host CPU model EPYC-Rome, vendor AMD; named CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, GraniteRapids-v3; model list continues past this excerpt]
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-noTSX Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-noTSX-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Haswell-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-noTSX Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 
04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: 
Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v5 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v6 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Icelake-Server-v7 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 
localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: IvyBridge-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: KnightsMill Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: KnightsMill-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Nehalem-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G1-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G2 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G2-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G3 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G3-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G4 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G4-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G5 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Opteron_G5-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Penryn Feb 23 04:27:44 localhost nova_compute[230643]: Penryn-v1 Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-IBRS Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-v1 Feb 23 04:27:44 localhost nova_compute[230643]: SandyBridge-v2 Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v1 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v2 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: SapphireRapids-v3 Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost nova_compute[230643]: Feb 23 04:27:44 localhost 
Feb 23 04:27:44 localhost nova_compute[230643]: [libvirt domainCapabilities output; the XML markup was lost in log capture and only element values survive, grouped below]
Feb 23 04:27:44 localhost nova_compute[230643]: CPU models: SapphireRapids-v4, SierraForest, SierraForest-v1, SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Feb 23 04:27:44 localhost nova_compute[230643]: memory backing source types: file, anonymous, memfd
Feb 23 04:27:44 localhost nova_compute[230643]: disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Feb 23 04:27:44 localhost nova_compute[230643]: graphics types: vnc, egl-headless, dbus
Feb 23 04:27:44 localhost nova_compute[230643]: hostdev: mode subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Feb 23 04:27:44 localhost nova_compute[230643]: rng models: virtio, virtio-transitional, virtio-non-transitional; rng backends: random, egd, builtin
Feb 23 04:27:44 localhost nova_compute[230643]: filesystem driver types: path, handle, virtiofs
Feb 23 04:27:44 localhost nova_compute[230643]: tpm models: tpm-tis, tpm-crb; tpm backends: emulator, external; backend version 2.0
Feb 23 04:27:44 localhost nova_compute[230643]: redirdev bus: usb; channel types: pty, unix; crypto model: qemu, crypto backend: builtin; interface backends: default, passt; panic models: isa, hyperv
Feb 23 04:27:44 localhost nova_compute[230643]: console/serial types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Feb 23 04:27:44 localhost nova_compute[230643]: hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; trailing values: 4095, on, off, off, Linux KVM Hv
Feb 23 04:27:44 localhost nova_compute[230643]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.106 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.107 230647 INFO nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Secure Boot support detected
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.108 230647 INFO nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.109 230647 INFO
nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.122 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.172 230647 INFO nova.virt.node [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.200 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Verified node be63d86c-a403-4ec9-a515-07ea2962cb4d matches my host np0005626463.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 23 04:27:44 localhost python3.9[230992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771838863.1308303-3729-25537200188804/.source.yaml _original_basename=.b5f3od9l follow=False checksum=c0274b4e8da702f77c15fa25b71400f0f9b8a680 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.250 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.254 230647 DEBUG nova.virt.libvirt.vif [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005626463.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-02-23T08:23:11Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.254 230647 DEBUG nova.network.os_vif_util [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.255 230647 DEBUG nova.network.os_vif_util [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.255 230647 DEBUG os_vif [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Created schema index
Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.314 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.315 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.315 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.315 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.316 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.318 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.329 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:44 localhost 
nova_compute[230643]: 2026-02-23 09:27:44.329 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.329 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.330 230647 INFO oslo.privsep.daemon [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpj4hijz_z/privsep.sock']#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.948 230647 INFO oslo.privsep.daemon [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.827 231014 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.832 231014 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.835 231014 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Feb 23 04:27:44 localhost nova_compute[230643]: 2026-02-23 09:27:44.835 231014 INFO oslo.privsep.daemon [-] privsep daemon running as pid 231014#033[00m Feb 23 04:27:45 localhost 
nova_compute[230643]: 2026-02-23 09:27:45.231 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.232 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.232 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.233 230647 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.234 230647 INFO os_vif [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.234 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state 
_get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.238 230647 DEBUG nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.239 230647 INFO nova.compute.manager [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Feb 23 04:27:45 localhost python3.9[231108]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:27:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38480 DF PROTO=TCP SPT=41330 DPT=9105 SEQ=392615369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2DE060000000001030307) Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.818 230647 INFO nova.service [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating service version for nova-compute on np0005626463.localdomain from 57 to 66#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG 
oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.864 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:27:45 localhost nova_compute[230643]: 2026-02-23 09:27:45.865 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.262 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:27:46 localhost python3.9[231236]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False 
checksum_algorithm=sha1 Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.339 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.339 230647 DEBUG nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:27:46 localhost systemd[1]: Started libvirt nodedev daemon. Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.680 230647 WARNING nova.virt.libvirt.driver [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.682 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12922MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.682 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.682 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.819 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.819 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.820 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.836 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.907 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 
04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.907 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.927 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.950 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_SSSE3,HW_CPU_X86_SSE2,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_BMI2,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI,HW_CPU_X86_SSE,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_FMA3,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SHA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 04:27:46 localhost nova_compute[230643]: 2026-02-23 09:27:46.988 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:27:47 localhost python3.9[231369]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False 
checksum_algorithm=sha1 Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.467 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.494 230647 DEBUG oslo_concurrency.processutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.500 230647 DEBUG nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Feb 23 04:27:47 localhost nova_compute[230643]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.500 230647 INFO nova.virt.libvirt.host [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] kernel doesn't support AMD SEV#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.502 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.503 230647 DEBUG nova.virt.libvirt.driver [None 
req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.564 230647 DEBUG nova.scheduler.client.report [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updated inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.565 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.565 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} 
update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.645 230647 DEBUG nova.compute.provider_tree [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Updating resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.728 230647 DEBUG nova.compute.resource_tracker [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.728 230647 DEBUG oslo_concurrency.lockutils [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.729 230647 DEBUG nova.service [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.826 230647 DEBUG nova.service [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Feb 23 04:27:47 localhost nova_compute[230643]: 2026-02-23 09:27:47.827 230647 DEBUG nova.servicegroup.drivers.db [None req-b3cc632c-4226-49a1-abb9-cf70f5f05fbd - - - - - -] DB_Driver: join new ServiceGroup member 
np0005626463.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Feb 23 04:27:48 localhost python3.9[231501]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None 
passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 23 04:27:48 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 121.6 (405 of 333 items), suggesting rotation. Feb 23 04:27:48 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 04:27:48 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:27:48 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:27:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29999 DF PROTO=TCP SPT=51930 DPT=9101 SEQ=2185825708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF2EA070000000001030307) Feb 23 04:27:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:27:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:27:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:27:48.527 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:27:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:27:48.529 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:27:49 localhost python3.9[231634]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:27:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:27:49 localhost systemd[1]: Stopping nova_compute container... Feb 23 04:27:49 localhost systemd[1]: tmp-crun.NXrBbM.mount: Deactivated successfully. 
Feb 23 04:27:49 localhost podman[231636]: 2026-02-23 09:27:49.202970542 +0000 UTC m=+0.097130801 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 23 04:27:49 localhost nova_compute[230643]: 2026-02-23 09:27:49.272 230647 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Feb 23 04:27:49 localhost podman[231636]: 2026-02-23 09:27:49.307384431 +0000 UTC m=+0.201544730 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2) Feb 23 04:27:49 localhost nova_compute[230643]: 2026-02-23 09:27:49.318 230647 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:27:49 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:27:49 localhost nova_compute[230643]: 2026-02-23 09:27:49.904 230647 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Feb 23 04:27:49 localhost nova_compute[230643]: 2026-02-23 09:27:49.906 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:27:49 localhost nova_compute[230643]: 2026-02-23 09:27:49.907 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:27:49 localhost nova_compute[230643]: 2026-02-23 09:27:49.907 230647 DEBUG oslo_concurrency.lockutils [None req-736b5caa-a8c6-428c-9749-ae829c2f5949 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:27:50 localhost journal[207530]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, ) Feb 23 04:27:50 localhost journal[207530]: hostname: np0005626463.localdomain Feb 23 04:27:50 localhost journal[207530]: End of file while reading data: Input/output error Feb 23 04:27:50 localhost systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Deactivated successfully. Feb 23 04:27:50 localhost systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Consumed 4.822s CPU time. 
Feb 23 04:27:50 localhost podman[231644]: 2026-02-23 09:27:50.360825066 +0000 UTC m=+1.229195845 container died 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, io.buildah.version=1.43.0, tcib_managed=true, container_name=nova_compute) Feb 23 04:27:50 localhost systemd[1]: tmp-crun.zKRho6.mount: 
Deactivated successfully. Feb 23 04:27:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d-userdata-shm.mount: Deactivated successfully. Feb 23 04:27:50 localhost podman[231644]: 2026-02-23 09:27:50.416756727 +0000 UTC m=+1.285127506 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 
9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 23 04:27:50 localhost podman[231644]: nova_compute Feb 23 04:27:50 localhost podman[231703]: error opening file `/run/crun/8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d/status`: No such file or directory Feb 23 04:27:50 localhost podman[231692]: 2026-02-23 09:27:50.518076153 +0000 UTC m=+0.066902480 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', 
'/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=nova_compute) Feb 23 04:27:50 localhost podman[231692]: nova_compute Feb 23 04:27:50 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Feb 23 04:27:50 localhost systemd[1]: Stopped nova_compute container. Feb 23 04:27:50 localhost systemd[1]: Starting nova_compute container... Feb 23 04:27:50 localhost systemd[1]: Started libcrun container. Feb 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:50 localhost podman[231705]: 2026-02-23 09:27:50.661089124 +0000 UTC m=+0.115333525 container init 
8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:27:50 localhost podman[231705]: 2026-02-23 09:27:50.671609339 +0000 UTC m=+0.125853740 container start 
8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, io.buildah.version=1.43.0, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Feb 23 04:27:50 localhost podman[231705]: nova_compute Feb 23 04:27:50 localhost nova_compute[231721]: + sudo -E kolla_set_configs Feb 23 04:27:50 localhost systemd[1]: 
Started nova_compute container. Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Validating config file Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying service configuration files Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for 
/etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /etc/ceph Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Creating directory /etc/ceph Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:50 localhost 
nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Writing out command to execute Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:50 localhost nova_compute[231721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:27:50 localhost nova_compute[231721]: ++ cat /run_command Feb 23 04:27:50 localhost nova_compute[231721]: + CMD=nova-compute Feb 23 04:27:50 localhost nova_compute[231721]: + ARGS= Feb 23 04:27:50 localhost nova_compute[231721]: + sudo kolla_copy_cacerts Feb 23 04:27:50 localhost nova_compute[231721]: + [[ ! -n '' ]] Feb 23 04:27:50 localhost nova_compute[231721]: + . 
kolla_extend_start Feb 23 04:27:50 localhost nova_compute[231721]: Running command: 'nova-compute' Feb 23 04:27:50 localhost nova_compute[231721]: + echo 'Running command: '\''nova-compute'\''' Feb 23 04:27:50 localhost nova_compute[231721]: + umask 0022 Feb 23 04:27:50 localhost nova_compute[231721]: + exec nova-compute Feb 23 04:27:52 localhost python3.9[231843]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None 
memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 23 04:27:52 localhost systemd[1]: Started libpod-conmon-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa.scope. Feb 23 04:27:52 localhost systemd[1]: Started libcrun container. 
Feb 23 04:27:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.380 231725 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.381 231725 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.381 231725 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.381 231725 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 23 04:27:52 localhost podman[231864]: 2026-02-23 09:27:52.38883004 +0000 UTC m=+0.170688110 container init 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, 
config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:27:52 localhost podman[231864]: 2026-02-23 09:27:52.397083756 +0000 UTC m=+0.178941826 container start 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:27:52 localhost python3.9[231843]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Applying nova statedir ownership Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b already 
42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/console.log Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/b81db1e2a8e54083d8c4b030cc59287a706969ae Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-b81db1e2a8e54083d8c4b030cc59287a706969ae Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Feb 23 04:27:52 localhost 
nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9 Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Checking uid: 42436 
gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/f23138a46bc477ec40b895db4322b27384fbb01ccd8da7395c9877132dfb82af Feb 23 04:27:52 localhost nova_compute_init[231887]: INFO:nova_statedir:Nova statedir ownership complete Feb 23 04:27:52 localhost systemd[1]: libpod-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa.scope: Deactivated successfully. Feb 23 04:27:52 localhost podman[231888]: 2026-02-23 09:27:52.468033155 +0000 UTC m=+0.053111067 container died 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0) Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.501 231725 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.531 231725 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.532 231725 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 23 04:27:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa-userdata-shm.mount: Deactivated successfully. Feb 23 04:27:52 localhost systemd[1]: var-lib-containers-storage-overlay-f44d9e7d68ca1accba5abc072a966a93a3cfaed75061df003916b61d6be8a5d6-merged.mount: Deactivated successfully. 
Feb 23 04:27:52 localhost podman[231899]: 2026-02-23 09:27:52.583069601 +0000 UTC m=+0.112912613 container cleanup 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=nova_compute_init, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:27:52 localhost systemd[1]: libpod-conmon-29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa.scope: Deactivated successfully. 
Feb 23 04:27:52 localhost nova_compute[231721]: 2026-02-23 09:27:52.924 231725 INFO nova.virt.driver [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.053 231725 INFO nova.compute.provider_config [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.060 231725 WARNING nova.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.061 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.062 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 
DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.063 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console_host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.064 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 
'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.065 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] flat_injected = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.066 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.067 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_usage_audit = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.068 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_config_append = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.069 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.070 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] long_rpc_timeout = 1800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.071 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.072 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.073 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.074 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rate_limit_burst = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.075 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.076 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.077 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.078 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ssl_only = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.079 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.080 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.081 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] web = /usr/share/spice-html5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.082 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.083 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 
231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.084 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 
09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.085 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 
09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.086 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.087 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.088 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.089 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.090 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.091 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.092 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.093 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 
DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.094 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 
localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.095 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.096 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.097 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] cyborg.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.098 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_inc_retry_interval = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.099 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.100 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.101 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.102 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.103 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.104 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service 
[None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.105 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.106 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.rbd_user = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.107 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.108 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.109 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.110 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.111 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 
localhost nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.112 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 
09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.113 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.114 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.service_type = 
baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.115 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] key_manager.backend = barbican log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.116 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.117 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.118 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.119 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.120 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.121 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vault.vault_url = http://127.0.0.1:8200 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.122 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.123 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.124 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.125 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.126 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.127 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.128 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.129 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.130 
231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.130 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.130 231725 WARNING oslo_config.cfg [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 23 04:27:53 localhost nova_compute[231721]: live_migration_uri is deprecated for removal in favor of two other options that Feb 23 04:27:53 localhost nova_compute[231721]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 23 04:27:53 localhost nova_compute[231721]: and ``live_migration_inbound_addr`` respectively. Feb 23 04:27:53 localhost nova_compute[231721]: ). 
Its value may be silently ignored in the future.#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.131 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.132 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_secret_uuid = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.133 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.134 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.135 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.136 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.137 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.138 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.139 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.140 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 
localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 
09:27:53.141 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.142 231725 
DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.143 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.144 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.145 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.146 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.147 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.148 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 
09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.149 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.150 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.151 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.152 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.153 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.154 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.155 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.156 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.157 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 
231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.158 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service 
[None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.159 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.160 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.161 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.162 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.163 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.164 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 
2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.165 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 
09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.166 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.167 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 
localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.168 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.169 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.170 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.171 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.172 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG 
oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.173 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.174 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.175 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost 
nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.176 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.177 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.178 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.179 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.180 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.181 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.182 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.183 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.184 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.185 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None 
req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.186 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 
231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.187 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.188 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.189 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.190 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.190 231725 DEBUG oslo_service.service [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.191 231725 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.209 231725 INFO nova.virt.node [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.210 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.210 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.210 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.211 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Connecting to libvirt:
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.225 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.227 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.228 231725 INFO nova.virt.libvirt.driver [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Connection event '1' reason 'None'
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.235 231725 INFO nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host capabilities
Feb 23 04:27:53 localhost nova_compute[231721]: [host capabilities XML body lost to markup stripping during log collection; recoverable values: host UUID bdcaa433-cfc7-450a-99ab-f0985ab59447, arch x86_64, CPU model EPYC-Rome-v4 (vendor AMD), migration transports tcp and rdma, memory 16116612 / pages 4029153, security models selinux (system_u:system_r:svirt_t:s0, system_u:system_r:svirt_tcg_t:s0) and dac (+107:+107), hvm guests at wordsize 32 and 64 via emulator /usr/libexec/qemu-kvm with machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (alias q35)]
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.246 231725 DEBUG nova.virt.libvirt.volume.mount [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.247 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.253 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 04:27:53 localhost nova_compute[231721]: [domain capabilities XML body lost to markup stripping during log collection; recoverable values: emulator /usr/libexec/qemu-kvm, domain kvm, machine pc-i440fx-rhel7.6.0, arch i686, firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom and pflash, secure yes/no), host-model CPU EPYC-Rome (vendor AMD), custom CPU models including 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1 through Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa; entry truncated]
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-IBPB Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v4 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v5 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 
23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v1 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v2 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v5 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: GraniteRapids-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v4 Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v5
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v6
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v7
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: KnightsMill
Feb 23 04:27:53 localhost nova_compute[231721]: KnightsMill-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G1-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G2
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G2-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G3
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G3-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G4
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G4-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G5
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G5-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Penryn
Feb 23 04:27:53 localhost nova_compute[231721]: Penryn-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v3
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v4
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-noTSX-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-noTSX-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v5
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere-v2
Feb 23 04:27:53 localhost nova_compute[231721]: athlon
Feb 23 04:27:53 localhost nova_compute[231721]: athlon-v1
Feb 23 04:27:53 localhost nova_compute[231721]: core2duo
Feb 23 04:27:53 localhost nova_compute[231721]: core2duo-v1
Feb 23 04:27:53 localhost nova_compute[231721]: coreduo
Feb 23 04:27:53 localhost nova_compute[231721]: coreduo-v1
Feb 23 04:27:53 localhost nova_compute[231721]: kvm32
Feb 23 04:27:53 localhost nova_compute[231721]: kvm32-v1
Feb 23 04:27:53 localhost nova_compute[231721]: kvm64
Feb 23 04:27:53 localhost nova_compute[231721]: kvm64-v1
Feb 23 04:27:53 localhost nova_compute[231721]: n270
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: n270-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: pentium Feb 23 04:27:53 localhost nova_compute[231721]: pentium-v1 Feb 23 04:27:53 localhost nova_compute[231721]: pentium2 Feb 23 04:27:53 localhost nova_compute[231721]: pentium2-v1 Feb 23 04:27:53 localhost nova_compute[231721]: pentium3 Feb 23 04:27:53 localhost nova_compute[231721]: pentium3-v1 Feb 23 04:27:53 localhost nova_compute[231721]: phenom Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: phenom-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: qemu32 Feb 23 04:27:53 localhost nova_compute[231721]: qemu32-v1 Feb 23 04:27:53 localhost nova_compute[231721]: qemu64 Feb 23 04:27:53 localhost nova_compute[231721]: qemu64-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: file Feb 23 04:27:53 localhost nova_compute[231721]: anonymous Feb 23 04:27:53 localhost nova_compute[231721]: memfd Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: disk Feb 23 04:27:53 localhost nova_compute[231721]: cdrom Feb 23 04:27:53 localhost nova_compute[231721]: floppy Feb 23 04:27:53 localhost nova_compute[231721]: lun Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: ide Feb 23 04:27:53 localhost nova_compute[231721]: fdc Feb 23 04:27:53 localhost nova_compute[231721]: scsi Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: sata Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: virtio-transitional Feb 23 04:27:53 localhost nova_compute[231721]: virtio-non-transitional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: vnc Feb 23 04:27:53 localhost nova_compute[231721]: egl-headless Feb 23 04:27:53 localhost nova_compute[231721]: dbus Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: subsystem Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: default Feb 23 04:27:53 localhost nova_compute[231721]: mandatory Feb 23 04:27:53 localhost nova_compute[231721]: requisite Feb 23 04:27:53 localhost nova_compute[231721]: optional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: pci Feb 23 04:27:53 localhost nova_compute[231721]: scsi Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: virtio-transitional Feb 23 04:27:53 localhost nova_compute[231721]: virtio-non-transitional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: random Feb 23 04:27:53 localhost nova_compute[231721]: egd Feb 23 04:27:53 localhost nova_compute[231721]: builtin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: path Feb 23 04:27:53 localhost nova_compute[231721]: handle Feb 23 04:27:53 localhost nova_compute[231721]: virtiofs Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: tpm-tis Feb 23 04:27:53 localhost nova_compute[231721]: tpm-crb Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: emulator Feb 23 04:27:53 localhost nova_compute[231721]: external Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 2.0 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: pty Feb 23 04:27:53 localhost nova_compute[231721]: unix Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: qemu Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: builtin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: default Feb 23 04:27:53 localhost nova_compute[231721]: passt Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: isa Feb 23 04:27:53 localhost nova_compute[231721]: hyperv Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: null Feb 23 04:27:53 localhost nova_compute[231721]: vc Feb 23 04:27:53 localhost nova_compute[231721]: pty Feb 23 04:27:53 localhost nova_compute[231721]: dev Feb 23 04:27:53 localhost nova_compute[231721]: file Feb 23 
04:27:53 localhost nova_compute[231721]: pipe Feb 23 04:27:53 localhost nova_compute[231721]: stdio Feb 23 04:27:53 localhost nova_compute[231721]: udp Feb 23 04:27:53 localhost nova_compute[231721]: tcp Feb 23 04:27:53 localhost nova_compute[231721]: unix Feb 23 04:27:53 localhost nova_compute[231721]: qemu-vdagent Feb 23 04:27:53 localhost nova_compute[231721]: dbus Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: relaxed Feb 23 04:27:53 localhost nova_compute[231721]: vapic Feb 23 04:27:53 localhost nova_compute[231721]: spinlocks Feb 23 04:27:53 localhost nova_compute[231721]: vpindex Feb 23 04:27:53 localhost nova_compute[231721]: runtime Feb 23 04:27:53 localhost nova_compute[231721]: synic Feb 23 04:27:53 localhost nova_compute[231721]: stimer Feb 23 04:27:53 localhost nova_compute[231721]: reset Feb 23 04:27:53 localhost nova_compute[231721]: vendor_id Feb 23 04:27:53 localhost nova_compute[231721]: frequencies Feb 23 04:27:53 localhost nova_compute[231721]: reenlightenment Feb 23 04:27:53 localhost nova_compute[231721]: tlbflush Feb 23 04:27:53 localhost nova_compute[231721]: ipi Feb 23 04:27:53 localhost nova_compute[231721]: avic Feb 23 04:27:53 localhost 
nova_compute[231721]: emsr_bitmap Feb 23 04:27:53 localhost nova_compute[231721]: xmm_input Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 4095 Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Linux KVM Hv Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.264 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: /usr/libexec/qemu-kvm Feb 23 04:27:53 localhost nova_compute[231721]: kvm Feb 23 04:27:53 localhost nova_compute[231721]: pc-q35-rhel9.8.0 Feb 23 04:27:53 localhost nova_compute[231721]: i686 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: rom Feb 23 04:27:53 localhost nova_compute[231721]: pflash Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: yes 
Feb 23 04:27:53 localhost nova_compute[231721]: no Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: no Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome Feb 23 04:27:53 localhost nova_compute[231721]: AMD Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 486 Feb 23 04:27:53 localhost nova_compute[231721]: 486-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-noTSX-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Cascadelake-Server-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v5 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: ClearwaterForest Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: ClearwaterForest-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Conroe Feb 23 04:27:53 localhost nova_compute[231721]: Conroe-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Cooperlake Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cooperlake-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cooperlake-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Denverton Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Denverton-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 
Denverton-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Denverton-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Dhyana Feb 23 04:27:53 localhost nova_compute[231721]: Dhyana-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Dhyana-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-IBPB Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 
23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v4 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v5 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v1 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v2 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v5 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 
23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v5
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v6
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v7
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: KnightsMill
Feb 23 04:27:53 localhost nova_compute[231721]: KnightsMill-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G1-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G2
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G2-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G3
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G3-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G4
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G4-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G5
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G5-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Penryn
Feb 23 04:27:53 localhost nova_compute[231721]: Penryn-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v3
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v4
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SierraForest-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-noTSX-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Client-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-noTSX-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Skylake-Server-v5
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Snowridge-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Westmere-v2
Feb 23 04:27:53 localhost nova_compute[231721]: athlon
Feb 23 04:27:53 localhost nova_compute[231721]: athlon-v1
Feb 23 04:27:53 localhost nova_compute[231721]: core2duo
Feb 23 04:27:53 localhost nova_compute[231721]: core2duo-v1
Feb 23 04:27:53 localhost nova_compute[231721]: coreduo
Feb 23 04:27:53 localhost nova_compute[231721]: coreduo-v1
Feb 23 04:27:53 localhost nova_compute[231721]: kvm32
Feb 23 04:27:53 localhost nova_compute[231721]: kvm32-v1
Feb 23 04:27:53 localhost nova_compute[231721]: kvm64
Feb 23 04:27:53 localhost nova_compute[231721]: kvm64-v1
Feb 23 04:27:53 localhost nova_compute[231721]: n270
Feb 23 04:27:53 localhost nova_compute[231721]: n270-v1
Feb 23 04:27:53 localhost nova_compute[231721]: pentium
Feb 23 04:27:53 localhost nova_compute[231721]: pentium-v1
Feb 23 04:27:53 localhost nova_compute[231721]: pentium2
Feb 23 04:27:53 localhost nova_compute[231721]: pentium2-v1
Feb 23 04:27:53 localhost nova_compute[231721]: pentium3
Feb 23 04:27:53 localhost nova_compute[231721]: pentium3-v1
Feb 23 04:27:53 localhost nova_compute[231721]: phenom
Feb 23 04:27:53 localhost nova_compute[231721]: phenom-v1
Feb 23 04:27:53 localhost nova_compute[231721]: qemu32
Feb 23 04:27:53 localhost nova_compute[231721]: qemu32-v1
Feb 23 04:27:53 localhost nova_compute[231721]: qemu64
Feb 23 04:27:53 localhost nova_compute[231721]: qemu64-v1
Feb 23 04:27:53 localhost nova_compute[231721]: file
Feb 23 04:27:53 localhost nova_compute[231721]: anonymous
Feb 23 04:27:53 localhost nova_compute[231721]: memfd
Feb 23 04:27:53 localhost nova_compute[231721]: disk
Feb 23 04:27:53 localhost nova_compute[231721]: cdrom
Feb 23 04:27:53 localhost nova_compute[231721]: floppy Feb 23 04:27:53 localhost nova_compute[231721]: lun Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: fdc Feb 23 04:27:53 localhost nova_compute[231721]: scsi Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: sata Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: virtio-transitional Feb 23 04:27:53 localhost nova_compute[231721]: virtio-non-transitional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: vnc Feb 23 04:27:53 localhost nova_compute[231721]: egl-headless Feb 23 04:27:53 localhost nova_compute[231721]: dbus Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: subsystem Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: default Feb 23 04:27:53 localhost nova_compute[231721]: mandatory Feb 23 04:27:53 localhost nova_compute[231721]: requisite Feb 23 04:27:53 localhost nova_compute[231721]: optional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: pci Feb 23 04:27:53 localhost nova_compute[231721]: scsi Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: virtio-transitional Feb 23 04:27:53 localhost nova_compute[231721]: virtio-non-transitional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: random Feb 23 04:27:53 localhost nova_compute[231721]: egd Feb 23 04:27:53 localhost nova_compute[231721]: builtin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: path Feb 23 04:27:53 localhost nova_compute[231721]: handle Feb 23 04:27:53 localhost nova_compute[231721]: virtiofs Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: tpm-tis Feb 23 04:27:53 localhost nova_compute[231721]: tpm-crb Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: emulator Feb 23 04:27:53 localhost nova_compute[231721]: external Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 2.0 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: usb 
Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: pty Feb 23 04:27:53 localhost nova_compute[231721]: unix Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: qemu Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: builtin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: default Feb 23 04:27:53 localhost nova_compute[231721]: passt Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: isa Feb 23 04:27:53 localhost nova_compute[231721]: hyperv Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: null Feb 23 04:27:53 localhost nova_compute[231721]: vc Feb 23 04:27:53 localhost nova_compute[231721]: pty Feb 23 04:27:53 localhost nova_compute[231721]: dev Feb 23 04:27:53 localhost nova_compute[231721]: file Feb 23 04:27:53 localhost nova_compute[231721]: pipe Feb 23 04:27:53 localhost nova_compute[231721]: stdio Feb 23 04:27:53 localhost nova_compute[231721]: udp Feb 23 04:27:53 localhost 
nova_compute[231721]: tcp Feb 23 04:27:53 localhost nova_compute[231721]: unix Feb 23 04:27:53 localhost nova_compute[231721]: qemu-vdagent Feb 23 04:27:53 localhost nova_compute[231721]: dbus Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: relaxed Feb 23 04:27:53 localhost nova_compute[231721]: vapic Feb 23 04:27:53 localhost nova_compute[231721]: spinlocks Feb 23 04:27:53 localhost nova_compute[231721]: vpindex Feb 23 04:27:53 localhost nova_compute[231721]: runtime Feb 23 04:27:53 localhost nova_compute[231721]: synic Feb 23 04:27:53 localhost nova_compute[231721]: stimer Feb 23 04:27:53 localhost nova_compute[231721]: reset Feb 23 04:27:53 localhost nova_compute[231721]: vendor_id Feb 23 04:27:53 localhost nova_compute[231721]: frequencies Feb 23 04:27:53 localhost nova_compute[231721]: reenlightenment Feb 23 04:27:53 localhost nova_compute[231721]: tlbflush Feb 23 04:27:53 localhost nova_compute[231721]: ipi Feb 23 04:27:53 localhost nova_compute[231721]: avic Feb 23 04:27:53 localhost nova_compute[231721]: emsr_bitmap Feb 23 04:27:53 localhost nova_compute[231721]: xmm_input Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 4095 Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Linux KVM Hv Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.318 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.325 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: /usr/libexec/qemu-kvm Feb 23 04:27:53 localhost nova_compute[231721]: kvm Feb 23 04:27:53 localhost nova_compute[231721]: pc-i440fx-rhel7.6.0 Feb 23 04:27:53 localhost nova_compute[231721]: x86_64 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: rom Feb 23 04:27:53 localhost 
nova_compute[231721]: loader type (cont.): pflash; readonly: yes, no; secure: no
Feb 23 04:27:53 localhost nova_compute[231721]: cpu host-passthrough migratable: on, off; maximum migratable: on, off
Feb 23 04:27:53 localhost nova_compute[231721]: cpu host-model: EPYC-Rome, vendor AMD
Feb 23 04:27:53 localhost nova_compute[231721]: cpu custom models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1
Feb 23
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v4 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v5 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 
23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v1 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v2 Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v5 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: GraniteRapids-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v4 Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 
Feb 23 04:27:53 localhost nova_compute[231721]: [libvirt domain capabilities XML logged here; markup lost during capture — recoverable element values follow]
Feb 23 04:27:53 localhost nova_compute[231721]: CPU models (cont.): Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SapphireRapids-v4 SierraForest SierraForest-v1 SierraForest-v2 SierraForest-v3 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 23 04:27:53 localhost nova_compute[231721]: memory backing source types: file anonymous memfd
Feb 23 04:27:53 localhost nova_compute[231721]: disk device types: disk cdrom floppy lun
Feb 23 04:27:53 localhost nova_compute[231721]: disk bus types: ide fdc scsi virtio usb sata
Feb 23 04:27:53 localhost nova_compute[231721]: disk virtio models: virtio virtio-transitional virtio-non-transitional
Feb 23 04:27:53 localhost nova_compute[231721]: graphics types: vnc egl-headless dbus
Feb 23 04:27:53 localhost nova_compute[231721]: hostdev mode: subsystem
Feb 23 04:27:53 localhost nova_compute[231721]: startupPolicy values: default mandatory requisite optional
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: pci Feb 23 04:27:53 localhost nova_compute[231721]: scsi Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: virtio Feb 23 04:27:53 localhost nova_compute[231721]: virtio-transitional Feb 23 04:27:53 localhost nova_compute[231721]: virtio-non-transitional Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: random Feb 23 04:27:53 localhost nova_compute[231721]: egd Feb 23 04:27:53 localhost nova_compute[231721]: builtin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: path Feb 23 04:27:53 localhost nova_compute[231721]: handle Feb 23 04:27:53 localhost nova_compute[231721]: virtiofs Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: tpm-tis Feb 23 04:27:53 localhost nova_compute[231721]: tpm-crb Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: emulator Feb 23 04:27:53 localhost nova_compute[231721]: external Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 2.0 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: usb Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: pty Feb 23 04:27:53 localhost nova_compute[231721]: unix Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: qemu Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: builtin Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: default Feb 23 04:27:53 localhost nova_compute[231721]: passt Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: isa Feb 23 04:27:53 localhost nova_compute[231721]: hyperv Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: null Feb 23 04:27:53 localhost nova_compute[231721]: vc Feb 23 04:27:53 localhost nova_compute[231721]: pty Feb 23 04:27:53 localhost nova_compute[231721]: dev Feb 23 04:27:53 localhost nova_compute[231721]: file Feb 23 
04:27:53 localhost nova_compute[231721]: pipe Feb 23 04:27:53 localhost nova_compute[231721]: stdio Feb 23 04:27:53 localhost nova_compute[231721]: udp Feb 23 04:27:53 localhost nova_compute[231721]: tcp Feb 23 04:27:53 localhost nova_compute[231721]: unix Feb 23 04:27:53 localhost nova_compute[231721]: qemu-vdagent Feb 23 04:27:53 localhost nova_compute[231721]: dbus Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: relaxed Feb 23 04:27:53 localhost nova_compute[231721]: vapic Feb 23 04:27:53 localhost nova_compute[231721]: spinlocks Feb 23 04:27:53 localhost nova_compute[231721]: vpindex Feb 23 04:27:53 localhost nova_compute[231721]: runtime Feb 23 04:27:53 localhost nova_compute[231721]: synic Feb 23 04:27:53 localhost nova_compute[231721]: stimer Feb 23 04:27:53 localhost nova_compute[231721]: reset Feb 23 04:27:53 localhost nova_compute[231721]: vendor_id Feb 23 04:27:53 localhost nova_compute[231721]: frequencies Feb 23 04:27:53 localhost nova_compute[231721]: reenlightenment Feb 23 04:27:53 localhost nova_compute[231721]: tlbflush Feb 23 04:27:53 localhost nova_compute[231721]: ipi Feb 23 04:27:53 localhost nova_compute[231721]: avic Feb 23 04:27:53 localhost 
nova_compute[231721]: emsr_bitmap Feb 23 04:27:53 localhost nova_compute[231721]: xmm_input Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 4095 Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Linux KVM Hv Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 23 04:27:53 localhost nova_compute[231721]: 2026-02-23 09:27:53.399 231725 DEBUG nova.virt.libvirt.host [None req-a9e5d3d9-9357-4ba9-96c4-7ac54478153f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: /usr/libexec/qemu-kvm Feb 23 04:27:53 localhost nova_compute[231721]: kvm Feb 23 04:27:53 localhost nova_compute[231721]: pc-q35-rhel9.8.0 Feb 23 04:27:53 localhost nova_compute[231721]: x86_64 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: efi Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Feb 23 04:27:53 localhost nova_compute[231721]: /usr/share/edk2/ovmf/OVMF_CODE.fd Feb 23 04:27:53 localhost nova_compute[231721]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Feb 23 04:27:53 localhost 
nova_compute[231721]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: rom Feb 23 04:27:53 localhost nova_compute[231721]: pflash Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: yes Feb 23 04:27:53 localhost nova_compute[231721]: no Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: yes Feb 23 04:27:53 localhost nova_compute[231721]: no Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: on Feb 23 04:27:53 localhost nova_compute[231721]: off Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome Feb 23 04:27:53 localhost nova_compute[231721]: AMD Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 486 Feb 23 04:27:53 localhost nova_compute[231721]: 486-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-IBRS Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-noTSX-IBRS Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Broadwell-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 
localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-noTSX Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost 
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v3 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v4 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Cascadelake-Server-v5 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: ClearwaterForest Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 
Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: ClearwaterForest-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Conroe Feb 23 04:27:53 localhost nova_compute[231721]: Conroe-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Cooperlake Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 
04:27:53 localhost nova_compute[231721]: Cooperlake-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Cooperlake-v2 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Denverton Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: 
Feb 23 04:27:53 localhost nova_compute[231721]: Denverton-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Denverton-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Denverton-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Dhyana
Feb 23 04:27:53 localhost nova_compute[231721]: Dhyana-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Dhyana-v2
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa-v1
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Genoa-v2
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-IBPB
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v1
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v2
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Milan-v3
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v1
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v2
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v3
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v4
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Rome-v5
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-Turin-v1
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v1
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v2
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v3
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v4
Feb 23 04:27:53 localhost nova_compute[231721]: EPYC-v5
Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids
Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v1
Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v2
Feb 23 04:27:53 localhost nova_compute[231721]: GraniteRapids-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-noTSX-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Haswell-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-noTSX
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v3
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v4
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v5
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v6
Feb 23 04:27:53 localhost nova_compute[231721]: Icelake-Server-v7
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: IvyBridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: KnightsMill
Feb 23 04:27:53 localhost nova_compute[231721]: KnightsMill-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Nehalem-v2
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G1-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G2
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G2-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G3
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G3-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G4
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G4-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G5
Feb 23 04:27:53 localhost nova_compute[231721]: Opteron_G5-v1
Feb 23 04:27:53 localhost nova_compute[231721]: Penryn
Feb 23 04:27:53 localhost nova_compute[231721]: Penryn-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-IBRS
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-v1
Feb 23 04:27:53 localhost nova_compute[231721]: SandyBridge-v2
Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids
nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: SapphireRapids-v1 Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:27:53 localhost nova_compute[231721]: Feb 23 04:31:01 localhost nova_compute[231721]: 2026-02-23 09:31:01.685 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:31:01 localhost nova_compute[231721]: 2026-02-23 09:31:01.692 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:31:01 localhost nova_compute[231721]: 2026-02-23 09:31:01.708 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:31:01 localhost nova_compute[231721]: 2026-02-23 09:31:01.710 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:31:01 localhost nova_compute[231721]: 2026-02-23 09:31:01.710 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:31:01 localhost systemd[1]: 
var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:31:02 localhost systemd[1]: libpod-conmon-bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.scope: Deactivated successfully. Feb 23 04:31:02 localhost rsyslogd[758]: imjournal: 5038 messages lost due to rate-limiting (20000 allowed within 600 seconds) Feb 23 04:31:02 localhost python3.9[248181]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:31:02 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:02 localhost python3.9[248291]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman Feb 23 04:31:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8255 DF PROTO=TCP SPT=53622 DPT=9100 SEQ=2225582548 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5E1860000000001030307) Feb 23 04:31:03 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully. Feb 23 04:31:03 localhost systemd[1]: var-lib-containers-storage-overlay-025f13926bcfeedaf7e085dde432a5009541a3e067b7031c72f0d516a81ad107-merged.mount: Deactivated successfully. 
Feb 23 04:31:03 localhost nova_compute[231721]: 2026-02-23 09:31:03.779 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:04 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:31:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:31:04 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. Feb 23 04:31:04 localhost podman[248303]: 2026-02-23 09:31:04.799807878 +0000 UTC m=+0.159450016 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent) Feb 23 04:31:04 localhost podman[248303]: 2026-02-23 09:31:04.835257119 +0000 UTC m=+0.194899257 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Feb 23 04:31:05 localhost nova_compute[231721]: 2026-02-23 09:31:05.227 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:05 localhost python3.9[248429]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 23 04:31:06 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:31:06 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. 
Feb 23 04:31:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13504 DF PROTO=TCP SPT=40444 DPT=9101 SEQ=3526832977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5EE860000000001030307) Feb 23 04:31:06 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:31:06 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:31:06 localhost systemd[1]: Started libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope. Feb 23 04:31:06 localhost podman[248430]: 2026-02-23 09:31:06.403691428 +0000 UTC m=+0.895375364 container exec da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:31:06 localhost podman[248430]: 2026-02-23 09:31:06.432896566 +0000 UTC m=+0.924580512 container exec_died 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:31:07 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:07 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:31:08 localhost python3.9[248568]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 23 04:31:08 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:31:08 localhost systemd[1]: libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope: Deactivated successfully. Feb 23 04:31:08 localhost systemd[1]: Started libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope. Feb 23 04:31:08 localhost podman[248569]: 2026-02-23 09:31:08.292528934 +0000 UTC m=+0.102106103 container exec da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:31:08 localhost podman[248569]: 2026-02-23 09:31:08.326329974 +0000 UTC m=+0.135907133 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck 
podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:31:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:31:08 localhost podman[248598]: 2026-02-23 09:31:08.616312356 +0000 UTC m=+0.189423757 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:31:08 localhost podman[248598]: 2026-02-23 09:31:08.628712841 +0000 UTC m=+0.201824292 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:31:08 localhost podman[248598]: unhealthy Feb 23 04:31:08 localhost nova_compute[231721]: 2026-02-23 09:31:08.810 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:09 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 23 04:31:09 localhost python3.9[248729]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:31:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60658 DF PROTO=TCP SPT=56642 DPT=9882 SEQ=2548069702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF5FAC30000000001030307) Feb 23 04:31:09 localhost systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully. Feb 23 04:31:09 localhost systemd[1]: var-lib-containers-storage-overlay-af4f3c6c37d07fdae63d07a90ecbe84180ed2951bbcccf4f59b147e3f8b29057-merged.mount: Deactivated successfully. Feb 23 04:31:09 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:31:09 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'. Feb 23 04:31:09 localhost systemd[1]: libpod-conmon-da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.scope: Deactivated successfully. 
Feb 23 04:31:09 localhost python3.9[248839]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman Feb 23 04:31:10 localhost nova_compute[231721]: 2026-02-23 09:31:10.265 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:10 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 23 04:31:10 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 23 04:31:11 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 23 04:31:11 localhost python3.9[248962]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 23 04:31:11 localhost systemd[1]: Started libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope. Feb 23 04:31:11 localhost podman[248963]: 2026-02-23 09:31:11.796600364 +0000 UTC m=+0.095305962 container exec 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347) Feb 23 04:31:11 localhost podman[248963]: 2026-02-23 09:31:11.826510594 +0000 UTC m=+0.125216212 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:31:12 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. 
Feb 23 04:31:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60660 DF PROTO=TCP SPT=56642 DPT=9882 SEQ=2548069702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF606C60000000001030307) Feb 23 04:31:12 localhost python3.9[249101]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None Feb 23 04:31:13 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:31:13 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 23 04:31:13 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 23 04:31:13 localhost nova_compute[231721]: 2026-02-23 09:31:13.864 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:13 localhost systemd[1]: libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope: Deactivated successfully. Feb 23 04:31:13 localhost systemd[1]: Started libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope. 
Feb 23 04:31:13 localhost podman[249102]: 2026-02-23 09:31:13.937984039 +0000 UTC m=+1.016423487 container exec 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, 
vcs-type=git, distribution-scope=public, version=9.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:31:13 localhost podman[249102]: 2026-02-23 09:31:13.970187769 +0000 UTC m=+1.048627237 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public) Feb 23 04:31:15 localhost systemd[1]: 
var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:31:15 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:31:15 localhost nova_compute[231721]: 2026-02-23 09:31:15.306 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22202 DF PROTO=TCP SPT=35286 DPT=9105 SEQ=2453412937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF612060000000001030307) Feb 23 04:31:15 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:31:16 localhost python3.9[249239]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:31:16 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:16 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 23 04:31:16 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:31:16 localhost systemd[1]: libpod-conmon-6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.scope: Deactivated successfully. Feb 23 04:31:17 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:31:17 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:17 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13506 DF PROTO=TCP SPT=40444 DPT=9101 SEQ=3526832977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF61E070000000001030307) Feb 23 04:31:18 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 23 04:31:18 localhost systemd[1]: var-lib-containers-storage-overlay-3e3fe691531a0d3ed4e0bd844aee95e09028b37c75ae9985cc1386696cb9ad2a-merged.mount: Deactivated successfully. Feb 23 04:31:18 localhost nova_compute[231721]: 2026-02-23 09:31:18.914 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:19 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:31:19 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 23 04:31:20 localhost nova_compute[231721]: 2026-02-23 09:31:20.352 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:20 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:31:20 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:20 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:20 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:31:21 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully. Feb 23 04:31:21 localhost systemd[1]: var-lib-containers-storage-overlay-912153276d119d62292ee43dc157a09b9029351ec12b42dcebd4c826260b5572-merged.mount: Deactivated successfully. Feb 23 04:31:22 localhost systemd[1]: var-lib-containers-storage-overlay-f3afd1cf5e6198a170887a65c5f10af446afae7f60b1c2348209fc3be458dddf-merged.mount: Deactivated successfully. Feb 23 04:31:22 localhost systemd[1]: var-lib-containers-storage-overlay-9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3-merged.mount: Deactivated successfully. Feb 23 04:31:22 localhost systemd[1]: var-lib-containers-storage-overlay-9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3-merged.mount: Deactivated successfully. 
Feb 23 04:31:22 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully. Feb 23 04:31:22 localhost systemd[1]: var-lib-containers-storage-overlay-f3afd1cf5e6198a170887a65c5f10af446afae7f60b1c2348209fc3be458dddf-merged.mount: Deactivated successfully. Feb 23 04:31:23 localhost systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully. Feb 23 04:31:23 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully. Feb 23 04:31:23 localhost nova_compute[231721]: 2026-02-23 09:31:23.942 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23613 DF PROTO=TCP SPT=39128 DPT=9102 SEQ=3298852228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF634400000000001030307) Feb 23 04:31:24 localhost systemd[1]: var-lib-containers-storage-overlay-9ebf51f80a46e835820a271b66c56bf3153d0ad4226e954d9a4e5952244e92d3-merged.mount: Deactivated successfully. Feb 23 04:31:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60662 DF PROTO=TCP SPT=56642 DPT=9882 SEQ=2548069702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF636060000000001030307) Feb 23 04:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:31:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:31:24 localhost podman[249258]: 2026-02-23 09:31:24.912749016 +0000 UTC m=+0.082689570 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:31:24 localhost systemd[1]: 
var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 23 04:31:24 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 23 04:31:24 localhost podman[249258]: 2026-02-23 09:31:24.966265714 +0000 UTC m=+0.136206268 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller) Feb 23 04:31:25 localhost podman[249257]: 2026-02-23 09:31:24.96612806 +0000 UTC m=+0.137152698 container health_status 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347) Feb 23 04:31:25 localhost podman[249257]: 2026-02-23 09:31:25.059242394 +0000 UTC m=+0.230267042 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release 
of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container) Feb 23 04:31:25 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:31:25 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:31:25 localhost podman[249259]: 2026-02-23 09:31:25.301733836 +0000 UTC m=+0.467982609 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:31:25 localhost podman[249259]: 2026-02-23 09:31:25.31178698 +0000 UTC m=+0.478035773 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:31:25 localhost nova_compute[231721]: 2026-02-23 09:31:25.354 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:25 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:31:26 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 23 04:31:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23615 DF PROTO=TCP SPT=39128 DPT=9102 SEQ=3298852228 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF640460000000001030307) Feb 23 04:31:27 localhost sshd[249323]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:31:28 localhost systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully. Feb 23 04:31:28 localhost systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully. Feb 23 04:31:28 localhost systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully. Feb 23 04:31:28 localhost nova_compute[231721]: 2026-02-23 09:31:28.994 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:30 localhost nova_compute[231721]: 2026-02-23 09:31:30.389 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:30 localhost systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully. 
Feb 23 04:31:30 localhost systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully. Feb 23 04:31:30 localhost systemd[1]: var-lib-containers-storage-overlay-239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111-merged.mount: Deactivated successfully. Feb 23 04:31:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50002 DF PROTO=TCP SPT=40690 DPT=9100 SEQ=4076589671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF64EC60000000001030307) Feb 23 04:31:30 localhost sshd[249325]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:31:31 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully. Feb 23 04:31:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:31:31 localhost systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully. 
Feb 23 04:31:31 localhost podman[249326]: 2026-02-23 09:31:31.561847008 +0000 UTC m=+0.084181016 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:31:31 localhost podman[249326]: 2026-02-23 09:31:31.575210161 +0000 UTC m=+0.097544179 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 23 04:31:31 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 04:31:32 localhost systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 04:31:32 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 04:31:32 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 04:31:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50003 DF PROTO=TCP SPT=40690 DPT=9100 SEQ=4076589671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF656C60000000001030307)
Feb 23 04:31:33 localhost nova_compute[231721]: 2026-02-23 09:31:33.995 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:34 localhost systemd[1]: var-lib-containers-storage-overlay-7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c-merged.mount: Deactivated successfully.
Feb 23 04:31:35 localhost nova_compute[231721]: 2026-02-23 09:31:35.421 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28380 DF PROTO=TCP SPT=60904 DPT=9101 SEQ=1954125877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF663860000000001030307)
Feb 23 04:31:36 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:31:36 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:31:36 localhost podman[249346]: 2026-02-23 09:31:36.736848226 +0000 UTC m=+0.084705442 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 23 04:31:36 localhost podman[249346]: 2026-02-23 09:31:36.771310167 +0000 UTC m=+0.119167363 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent)
Feb 23 04:31:36 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:31:37 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:31:38 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:31:38 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:31:38 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:31:39 localhost nova_compute[231721]: 2026-02-23 09:31:39.011 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35897 DF PROTO=TCP SPT=54996 DPT=9882 SEQ=2209706486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF66FF20000000001030307)
Feb 23 04:31:39 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:39 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:31:39 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:31:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:31:40 localhost nova_compute[231721]: 2026-02-23 09:31:40.425 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:40 localhost podman[249449]: 2026-02-23 09:31:40.449600427 +0000 UTC m=+0.083696951 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 04:31:40 localhost podman[249449]: 2026-02-23 09:31:40.493762243 +0000 UTC m=+0.127858787 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 04:31:40 localhost podman[249449]: unhealthy
Feb 23 04:31:40 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 04:31:40 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 04:31:40 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:40 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:31:40 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:31:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35899 DF PROTO=TCP SPT=54996 DPT=9882 SEQ=2209706486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF67C060000000001030307)
Feb 23 04:31:43 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:31:43 localhost systemd[1]: var-lib-containers-storage-overlay-49d558d6cd3227e44d6cf362abc8b50d968f0fb79b74496dd7e1499d728668a7-merged.mount: Deactivated successfully.
Feb 23 04:31:44 localhost nova_compute[231721]: 2026-02-23 09:31:44.044 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:44 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 04:31:44 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:31:44 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:44 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:45 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50005 DF PROTO=TCP SPT=40690 DPT=9100 SEQ=4076589671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF686060000000001030307)
Feb 23 04:31:45 localhost nova_compute[231721]: 2026-02-23 09:31:45.471 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:45 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 04:31:45 localhost systemd[1]: var-lib-containers-storage-overlay-abedacbc22b2fa6d036e7ebf1118c866641128a14eaf11021e9356d60564993d-merged.mount: Deactivated successfully.
Feb 23 04:31:46 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:46 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 04:31:46 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 04:31:47 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:47 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:31:47 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:31:48 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 04:31:48 localhost systemd[1]: var-lib-containers-storage-overlay-6de1ea423bd6d005b3c98cfa65644155837a90a885b901ec0c789a8fa4573360-merged.mount: Deactivated successfully.
Feb 23 04:31:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:31:48.532 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:31:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:31:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:31:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:31:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:31:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28382 DF PROTO=TCP SPT=60904 DPT=9101 SEQ=1954125877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF694060000000001030307)
Feb 23 04:31:49 localhost nova_compute[231721]: 2026-02-23 09:31:49.084 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:49 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 04:31:49 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 04:31:49 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 04:31:50 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:50 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 04:31:50 localhost systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 23 04:31:50 localhost nova_compute[231721]: 2026-02-23 09:31:50.515 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:50 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:31:50 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:50 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:31:51 localhost systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 23 04:31:51 localhost systemd[1]: var-lib-containers-storage-overlay-4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b-merged.mount: Deactivated successfully.
Feb 23 04:31:52 localhost systemd[1]: var-lib-containers-storage-overlay-4603fc849c2ecb1a2dd39fe5f99a90015995e0b99d1b206aafaed4ee8a276f7b-merged.mount: Deactivated successfully.
Feb 23 04:31:54 localhost nova_compute[231721]: 2026-02-23 09:31:54.122 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33650 DF PROTO=TCP SPT=54694 DPT=9102 SEQ=3463304677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6A99F0000000001030307)
Feb 23 04:31:54 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:31:54 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:31:54 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:31:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35901 DF PROTO=TCP SPT=54996 DPT=9882 SEQ=2209706486 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6AC060000000001030307)
Feb 23 04:31:55 localhost nova_compute[231721]: 2026-02-23 09:31:55.560 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:31:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 04:31:55 localhost systemd[1]: tmp-crun.LvZYfc.mount: Deactivated successfully.
Feb 23 04:31:55 localhost podman[249472]: 2026-02-23 09:31:55.929137465 +0000 UTC m=+0.102093567 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 04:31:55 localhost podman[249472]: 2026-02-23 09:31:55.970134565 +0000 UTC m=+0.143090697 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, distribution-scope=public, io.openshift.expose-services=, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 23 04:31:55 localhost podman[249474]: 2026-02-23 09:31:55.984782977 +0000 UTC m=+0.148413377 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 23 04:31:55 localhost podman[249474]: 2026-02-23 09:31:55.995185492 +0000 UTC m=+0.158815902 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Feb 23 04:31:56 localhost systemd[1]: tmp-crun.nd1vyX.mount: Deactivated successfully.
Feb 23 04:31:56 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:31:56 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:31:56 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:31:56 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:31:56 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:31:56 localhost podman[249473]: 2026-02-23 09:31:56.57765929 +0000 UTC m=+0.746875259 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:31:56 localhost podman[249473]: 2026-02-23 09:31:56.659213307 +0000 UTC m=+0.828429246 container exec_died 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:31:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33652 DF PROTO=TCP SPT=54694 DPT=9102 SEQ=3463304677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6B5C60000000001030307) Feb 23 04:31:57 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:31:57 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:31:57 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:31:57 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:31:58 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:31:58 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:31:58 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:31:59 localhost nova_compute[231721]: 2026-02-23 09:31:59.170 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:31:59 localhost nova_compute[231721]: 2026-02-23 09:31:59.810 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:31:59 localhost nova_compute[231721]: 2026-02-23 09:31:59.810 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:31:59 localhost nova_compute[231721]: 2026-02-23 09:31:59.833 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:31:59 localhost nova_compute[231721]: 2026-02-23 09:31:59.833 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:31:59 localhost nova_compute[231721]: 2026-02-23 09:31:59.833 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.247 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock 
"refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.248 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.248 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.248 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.598 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.641 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.655 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.655 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.655 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.656 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task 
ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.656 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.657 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.658 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.672 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.672 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.673 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 09:32:00.673 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:32:00 localhost nova_compute[231721]: 2026-02-23 
09:32:00.674 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:32:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7066 DF PROTO=TCP SPT=54430 DPT=9100 SEQ=2103503740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6C4060000000001030307) Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.122 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.194 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.194 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.392 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.393 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12414MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.394 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.394 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.470 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.471 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.471 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.499 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:32:01 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 23 04:32:01 localhost systemd[1]: var-lib-containers-storage-overlay-c3d90fa516ab23da2bd722ed78a874566a7daf8e8d3d852895f80962cb5a1d59-merged.mount: Deactivated successfully. Feb 23 04:32:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:32:01 localhost systemd[1]: tmp-crun.tCDynR.mount: Deactivated successfully. 
Feb 23 04:32:01 localhost podman[249582]: 2026-02-23 09:32:01.81588939 +0000 UTC m=+0.085423133 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
container_name=ceilometer_agent_compute) Feb 23 04:32:01 localhost podman[249582]: 2026-02-23 09:32:01.853305802 +0000 UTC m=+0.122839595 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0) Feb 23 04:32:01 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.965 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.972 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.994 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:32:01 localhost nova_compute[231721]: 2026-02-23 09:32:01.996 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:32:01 
localhost nova_compute[231721]: 2026-02-23 09:32:01.997 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:32:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7067 DF PROTO=TCP SPT=54430 DPT=9100 SEQ=2103503740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6CC060000000001030307) Feb 23 04:32:03 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:03 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 23 04:32:03 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 23 04:32:04 localhost nova_compute[231721]: 2026-02-23 09:32:04.177 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:04 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:04 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:04 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 23 04:32:05 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:32:05 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:05 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:05 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:32:05 localhost nova_compute[231721]: 2026-02-23 09:32:05.645 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22691 DF PROTO=TCP SPT=36906 DPT=9101 SEQ=2180461920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6D8C60000000001030307) Feb 23 04:32:06 localhost systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully. Feb 23 04:32:06 localhost sshd[249603]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:32:06 localhost systemd[1]: var-lib-containers-storage-overlay-bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1-merged.mount: Deactivated successfully. Feb 23 04:32:06 localhost systemd[1]: var-lib-containers-storage-overlay-bf00c918822c143f438250923a86f39afe39c46ee0adc9fcf99ac7bc5e8117c1-merged.mount: Deactivated successfully. 
Feb 23 04:32:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:32:07 localhost podman[249605]: 2026-02-23 09:32:07.064473582 +0000 UTC m=+0.082085322 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:32:07 localhost podman[249605]: 2026-02-23 09:32:07.073290489 +0000 UTC m=+0.090902229 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Feb 23 04:32:08 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:32:08 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 23 04:32:08 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 23 04:32:08 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:32:09 localhost nova_compute[231721]: 2026-02-23 09:32:09.215 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27808 DF PROTO=TCP SPT=52920 DPT=9882 SEQ=132694475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6E5230000000001030307) Feb 23 04:32:10 localhost sshd[249623]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:32:10 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:10 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. Feb 23 04:32:10 localhost systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully. 
Feb 23 04:32:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:32:10 localhost nova_compute[231721]: 2026-02-23 09:32:10.679 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:10 localhost podman[249625]: 2026-02-23 09:32:10.724281763 +0000 UTC m=+0.105672955 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:32:10 localhost podman[249625]: 2026-02-23 09:32:10.76021701 +0000 UTC m=+0.141608192 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': 
'/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:32:10 localhost podman[249625]: unhealthy Feb 23 04:32:11 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:11 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:11 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:11 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:32:11 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'. Feb 23 04:32:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27810 DF PROTO=TCP SPT=52920 DPT=9882 SEQ=132694475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6F1460000000001030307) Feb 23 04:32:12 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 23 04:32:12 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:12 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:14 localhost nova_compute[231721]: 2026-02-23 09:32:14.262 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:14 localhost systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully. Feb 23 04:32:14 localhost systemd[1]: var-lib-containers-storage-overlay-51915910ced93426f00f1704499e6c4900ce6f68bf275b1a1584b9abaa73dcbc-merged.mount: Deactivated successfully. Feb 23 04:32:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7069 DF PROTO=TCP SPT=54430 DPT=9100 SEQ=2103503740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF6FC060000000001030307) Feb 23 04:32:15 localhost nova_compute[231721]: 2026-02-23 09:32:15.720 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:16 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:32:17 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 23 04:32:17 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 23 04:32:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22693 DF PROTO=TCP SPT=36906 DPT=9101 SEQ=2180461920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF708060000000001030307) Feb 23 04:32:19 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:19 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:32:19 localhost nova_compute[231721]: 2026-02-23 09:32:19.298 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:19 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:32:20 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:20 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:20 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 23 04:32:20 localhost nova_compute[231721]: 2026-02-23 09:32:20.767 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:21 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:32:21 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:21 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:21 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 23 04:32:21 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 23 04:32:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21394 DF PROTO=TCP SPT=34830 DPT=9102 SEQ=3082614443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF71EA00000000001030307) Feb 23 04:32:24 localhost nova_compute[231721]: 2026-02-23 09:32:24.333 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:24 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 23 04:32:24 localhost systemd[1]: var-lib-containers-storage-overlay-7e96575c95037d7184ed74bc1e793aa507f3bf187550011550ade6ddf34aa4ff-merged.mount: Deactivated successfully. 
Feb 23 04:32:24 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 23 04:32:24 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Feb 23 04:32:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27812 DF PROTO=TCP SPT=52920 DPT=9882 SEQ=132694475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF722070000000001030307) Feb 23 04:32:25 localhost nova_compute[231721]: 2026-02-23 09:32:25.802 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:32:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:32:26 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:32:26 localhost systemd[1]: tmp-crun.7IT0kB.mount: Deactivated successfully. 
Feb 23 04:32:26 localhost podman[249649]: 2026-02-23 09:32:26.961394259 +0000 UTC m=+0.123475270 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:32:26 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 23 04:32:27 localhost podman[249648]: 2026-02-23 09:32:27.006719497 +0000 UTC m=+0.179703371 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter) Feb 23 04:32:27 localhost podman[249648]: 2026-02-23 09:32:27.017891609 +0000 UTC m=+0.190875503 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, version=9.7, architecture=x86_64, maintainer=Red Hat, Inc.) Feb 23 04:32:27 localhost podman[249649]: 2026-02-23 09:32:27.10131409 +0000 UTC m=+0.263395091 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:32:27 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:32:27 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:32:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21396 DF PROTO=TCP SPT=34830 DPT=9102 SEQ=3082614443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF72AC60000000001030307) Feb 23 04:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:32:27 localhost podman[249689]: 2026-02-23 09:32:27.907982074 +0000 UTC m=+0.079566877 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:32:28 localhost podman[249689]: 2026-02-23 09:32:28.00541041 +0000 UTC m=+0.176995233 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 23 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:32:29 localhost nova_compute[231721]: 2026-02-23 09:32:29.377 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:29 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:32:29 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:32:30 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:30 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:30 localhost nova_compute[231721]: 2026-02-23 09:32:30.853 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:30 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41882 DF PROTO=TCP SPT=49340 DPT=9100 SEQ=2011130389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF739070000000001030307) Feb 23 04:32:30 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:31 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:32:31 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:31 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:32:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:32:32 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 23 04:32:32 localhost podman[249712]: 2026-02-23 09:32:32.166234179 +0000 UTC m=+0.092610470 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 04:32:32 localhost podman[249712]: 2026-02-23 09:32:32.178547318 +0000 UTC m=+0.104923659 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 23 04:32:32 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 04:32:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41883 DF PROTO=TCP SPT=49340 DPT=9100 SEQ=2011130389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF741060000000001030307)
Feb 23 04:32:33 localhost systemd[1]: tmp-crun.u3EDEU.mount: Deactivated successfully.
Feb 23 04:32:33 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:34 localhost nova_compute[231721]: 2026-02-23 09:32:34.415 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e-merged.mount: Deactivated successfully.
Feb 23 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-f1c03aa8e256d3d38d275b9e911c2e9e69db76da13bfae548890815046fc902e-merged.mount: Deactivated successfully.
Feb 23 04:32:35 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:32:35 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 04:32:35 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 04:32:35 localhost nova_compute[231721]: 2026-02-23 09:32:35.883 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:36 localhost systemd[1]: session-55.scope: Deactivated successfully.
Feb 23 04:32:36 localhost systemd[1]: session-55.scope: Consumed 1min 21.309s CPU time.
Feb 23 04:32:36 localhost systemd-logind[759]: Session 55 logged out. Waiting for processes to exit.
Feb 23 04:32:36 localhost systemd-logind[759]: Removed session 55.
Feb 23 04:32:36 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43571 DF PROTO=TCP SPT=57646 DPT=9101 SEQ=956909870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF74E060000000001030307)
Feb 23 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:36 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:37 localhost systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 23 04:32:37 localhost systemd[1]: var-lib-containers-storage-overlay-b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653-merged.mount: Deactivated successfully.
Feb 23 04:32:37 localhost systemd[1]: var-lib-containers-storage-overlay-b2c770567d2f47629c218ae90d489529d9f3e3ed2618072d59a3365c20854653-merged.mount: Deactivated successfully.
Feb 23 04:32:38 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 04:32:39 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:32:39 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:32:39 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:32:39 localhost podman[249731]: 2026-02-23 09:32:39.109628964 +0000 UTC m=+0.075818847 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 23 04:32:39 localhost podman[249731]: 2026-02-23 09:32:39.118097028 +0000 UTC m=+0.084286921 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 04:32:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21398 DF PROTO=TCP SPT=34830 DPT=9102 SEQ=3082614443 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF75A070000000001030307)
Feb 23 04:32:39 localhost nova_compute[231721]: 2026-02-23 09:32:39.455 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:39 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 23 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay-df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521-merged.mount: Deactivated successfully.
Feb 23 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay-df719217e40f9ffd193139cfaeeaaebcf46866a6b616db04d8d1f0793e86d521-merged.mount: Deactivated successfully.
Feb 23 04:32:40 localhost nova_compute[231721]: 2026-02-23 09:32:40.926 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:32:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:32:42 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47585 DF PROTO=TCP SPT=37598 DPT=9882 SEQ=2607443554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF766460000000001030307)
Feb 23 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 04:32:42 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 04:32:42 localhost podman[249798]: 2026-02-23 09:32:42.756333854 +0000 UTC m=+0.315619003 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:32:42 localhost podman[249798]: 2026-02-23 09:32:42.809463854 +0000 UTC m=+0.368748993 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 04:32:42 localhost podman[249798]: unhealthy
Feb 23 04:32:44 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:32:44 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:32:44 localhost nova_compute[231721]: 2026-02-23 09:32:44.500 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:44 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:32:44 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 04:32:44 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 04:32:44 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63532 DF PROTO=TCP SPT=45642 DPT=9105 SEQ=2572149185 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF770060000000001030307)
Feb 23 04:32:45 localhost sshd[249878]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:32:45 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:32:45 localhost nova_compute[231721]: 2026-02-23 09:32:45.971 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:46 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:32:46 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:46 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:32:48 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43573 DF PROTO=TCP SPT=57646 DPT=9101 SEQ=956909870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF77E060000000001030307)
Feb 23 04:32:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:32:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:32:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:32:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:32:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:32:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:32:49 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 04:32:49 localhost systemd[1]: var-lib-containers-storage-overlay-f56bdf141506d099102e067531f1fbfb1e40a67a70799f11577fc9c27fb9f83a-merged.mount: Deactivated successfully.
Feb 23 04:32:49 localhost nova_compute[231721]: 2026-02-23 09:32:49.529 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:50 localhost systemd[1]: var-lib-containers-storage-overlay-4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e-merged.mount: Deactivated successfully.
Feb 23 04:32:51 localhost nova_compute[231721]: 2026-02-23 09:32:51.023 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:51 localhost systemd[1]: var-lib-containers-storage-overlay-1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998-merged.mount: Deactivated successfully.
Feb 23 04:32:51 localhost systemd[1]: var-lib-containers-storage-overlay-1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998-merged.mount: Deactivated successfully.
Feb 23 04:32:52 localhost sshd[249917]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e-merged.mount: Deactivated successfully.
Feb 23 04:32:52 localhost systemd[1]: var-lib-containers-storage-overlay-4e7c8cf8be5e28661f08c7ae9ca08b0a811b1f296a0663a493871b4299da2d4e-merged.mount: Deactivated successfully.
Feb 23 04:32:52 localhost nova_compute[231721]: 2026-02-23 09:32:52.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:32:52 localhost nova_compute[231721]: 2026-02-23 09:32:52.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb 23 04:32:52 localhost nova_compute[231721]: 2026-02-23 09:32:52.582 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb 23 04:32:52 localhost nova_compute[231721]: 2026-02-23 09:32:52.582 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:32:52 localhost nova_compute[231721]: 2026-02-23 09:32:52.583 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb 23 04:32:52 localhost nova_compute[231721]: 2026-02-23 09:32:52.597 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:32:53 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 04:32:53 localhost systemd[1]: var-lib-containers-storage-overlay-0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4-merged.mount: Deactivated successfully.
Feb 23 04:32:53 localhost nova_compute[231721]: 2026-02-23 09:32:53.606 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:32:53 localhost systemd[1]: var-lib-containers-storage-overlay-d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d-merged.mount: Deactivated successfully.
Feb 23 04:32:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11511 DF PROTO=TCP SPT=48682 DPT=9102 SEQ=41584801 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF793D00000000001030307)
Feb 23 04:32:54 localhost systemd[1]: var-lib-containers-storage-overlay-882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3-merged.mount: Deactivated successfully.
Feb 23 04:32:54 localhost nova_compute[231721]: 2026-02-23 09:32:54.568 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47587 DF PROTO=TCP SPT=37598 DPT=9882 SEQ=2607443554 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF796060000000001030307)
Feb 23 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-1962cc6363cc9ac3ab3c2a513bdaec43a309cd0406c08aa2e9112851ab244998-merged.mount: Deactivated successfully.
Feb 23 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Feb 23 04:32:55 localhost systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Feb 23 04:32:55 localhost nova_compute[231721]: 2026-02-23 09:32:55.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:32:55 localhost nova_compute[231721]: 2026-02-23 09:32:55.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 23 04:32:56 localhost nova_compute[231721]: 2026-02-23 09:32:56.057 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.135 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82509c2f-e805-495c-861d-404e51ac72ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.131570', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a26e928c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'c47dba8b8c5f859c6255640b65821a2663ccad4f99bb5db509c377f5c9b7d3d6'}]}, 'timestamp': '2026-02-23 09:32:56.136255', '_unique_id': '63494af84f674ee8a5e3207e97cdffdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.137 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.138 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.160 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48089054-0a97-46f0-bfdf-d950556e8e39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:32:56.139143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a2725ce6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.349778366, 
'message_signature': '5514cbcb4d670468e4180ee8b2cbec6c102b44ca5f8eb73f9e84e26a0fd0c611'}]}, 'timestamp': '2026-02-23 09:32:56.161085', '_unique_id': '835e2340c5814c3888d860587da24f11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.162 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.202 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '65f28275-b33a-4d1f-9643-955c89427800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.163636', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a278aa42-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': 'e4a6bf597e5f2b7cc0e7a95e30556a0e19fcba22d2a34c57bc54b7caa8c67f5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.163636', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a278bd34-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '659b51142d252d643b231d2765c78311a316e6f0941c156127315111cd100f94'}]}, 'timestamp': '2026-02-23 09:32:56.202806', '_unique_id': 'd242a90df7844e3c88985212737276ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.204 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cc7f603-dc42-48cf-be05-5b4f1b446a1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.205457', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a279378c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'c83ac948f7216d8740034a359478e12e04342e27d098550790f0f65a08832960'}]}, 'timestamp': '2026-02-23 09:32:56.205983', '_unique_id': 'd0261587b5c34adaab474e946814affc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.206 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a4a0421-61b0-4f49-9313-9f38aa7ef032', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.208173', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a279a122-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'f77a18813211db7418c8142bd8d57bc97bee09f4662c7714b319de99364d0f6f'}]}, 'timestamp': '2026-02-23 09:32:56.208649', '_unique_id': '9b36e81f8d9e489ebb7ae90211c8ee26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.209 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98de4735-6ab7-441f-88ff-44f9fd59004f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.210897', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27a0b8a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '38c1d96aad64be8eff48d9e23ee943647c70fdbb584a4cb7ffe490029721a46e'}]}, 'timestamp': '2026-02-23 09:32:56.211374', '_unique_id': 'c0c4e640ed8045a184185de67f36946b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.212 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d9facf0-8fc0-47cc-95cf-5dc0bdf10a57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.215007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27aaba8-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': 'b139823ebf670036d8325d49339fd15f039caf6dc9dcbfc34e0e71d486861f50'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.215007', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27abc4c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': 'ec90a83b9348925c91498b67313d1246074c0d28b7f123d2bdb387f7330ece75'}]}, 'timestamp': '2026-02-23 09:32:56.215972', '_unique_id': '0f2221d6624840f587c38c4acaa19f34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:32:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.216 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5d472a7-3797-4892-870b-f2a41f1dbbf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.218238', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27b2a24-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '5327072ff96b050139a8f2e4045b069461798c5395a10ae6190be45304bc8325'}]}, 'timestamp': '2026-02-23 09:32:56.218710', '_unique_id': '82d0e217d9c04e7cb3d3661eb9177fd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 54850000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46fe0a86-5686-48c1-8726-e9251a11e9fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54850000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:32:56.220850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a27b90fe-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.349778366, 'message_signature': '4d982532e70ad717e08cf7161ad76e9a3dcb928f516ad548084f0c5f1222625f'}]}, 'timestamp': '2026-02-23 09:32:56.221387', '_unique_id': '7b25c7938bb04691a862e26955dbd069'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8fed521d-8876-4e23-b4d9-702b95028ba5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.222819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27bda32-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '9370494d269c337a833696da33e53f2316a4414631f51d323f481e7db5f0054c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.222819', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27be482-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '71a49b91119e58ff3dcb93d6441573d6b43c1445f9e52e2e39d0bc341771979a'}]}, 'timestamp': '2026-02-23 09:32:56.223373', '_unique_id': '250e6282ebb643c9b08f093e440e117b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fac153be-d491-4c6a-87c6-9c530827c254', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.225025', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27c3022-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '8c59fc56120c36a93fc7af745a1b57b3f4d31e5e1893d388e8269be0156b332b'}]}, 'timestamp': '2026-02-23 09:32:56.225346', '_unique_id': '72837f4c423d431ab39d1de48a05e1ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3518902b-941f-4268-b996-2dc983a73687', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.226721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27c71ae-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '0a7f005c30e1c3823a3f6f5f3a4aff25b4c93d4c1a3f47a268652f3788dbdf5a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.226721', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27c7f00-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '27554210fe8367c1f7e87fe4fab29b4c2cf2400df01efd0bedfe25fc65b8b99f'}]}, 'timestamp': '2026-02-23 09:32:56.227331', '_unique_id': '58802dacd1d847459381d3df8b842f43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '02ab430f-bbc8-4a3e-a05e-75993a7f3ba1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.228893', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27cc712-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '08bea3778b74c74d39939389afe848543247de210a974f913bcf3d68e8921cf9'}]}, 'timestamp': '2026-02-23 09:32:56.229244', '_unique_id': '729f9553263046168c035e5e7fa51893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c79dd5a8-3c65-489e-a26c-0e49333f99da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.230629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27eb9b4-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '9c04206439ca020ec4362d22f4dff51e20818590131a0e1aeff1f24b8f96216d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.230629', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27ec77e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '100cb7115210309e076305fd67a0a0dbd066c58b55f5a6e131b55250a16ae8d8'}]}, 'timestamp': '2026-02-23 09:32:56.242298', '_unique_id': '2ac369556d6c4213b3cd70a7b6932231'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.242 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '318e6f0e-b0c1-4d2f-bb26-a1f4fa608f5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.243786', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27f0d60-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': 'f927672e800aed80538a7c9d9a1f9acab5ffe8a1a20af62aeef349a8d27cf99b'}]}, 'timestamp': '2026-02-23 09:32:56.244152', '_unique_id': 'aa9b7731cd3547c79f3b982859a5f1b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.244 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4726e68f-19a2-4c55-932d-36e28fb4c465', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.245566', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a27f51ee-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '6d0e3797b058f605dc23aeaf82ab79e60f7351331d2ffa836df5fe7cfec3bdd9'}]}, 'timestamp': '2026-02-23 09:32:56.245854', '_unique_id': 'c87de322572844efb22414db4d6baa4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.246 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91508e55-2e52-4700-be7c-4120bd10b950', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.247302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27f95be-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '78946f3cf075e985d5253d0f882d6a39446fe52966df0a15ca584b22d8e0d471'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.247302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27f9ff0-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '0e5fec729ccc81281612885db55eafe81772eae24529bebd390514a09ba1b04c'}]}, 'timestamp': '2026-02-23 09:32:56.247837', '_unique_id': '5db49799673e42f6a3caa4e407b1a6c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.248 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c12a4ca-dc99-4396-be35-991695851a84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.249426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a27fe898-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '542d2452a1ad0e2c06d66692588e3d75e17426ce3665022292db4fa65c56da1f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.249426', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a27ff2de-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '4e28dc28a09f491716ce0e67b3037c681589cfa4fbec4c34c6dd0c8192a690c7'}]}, 'timestamp': '2026-02-23 09:32:56.249972', '_unique_id': 'eee84fb3880d4b6fa777d98a2c5f0451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.250 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd4dcd49-4f6f-4b8d-8b9f-0e816bbe595f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9216, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:32:56.251420', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a28036ae-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.321053996, 'message_signature': '759dfeda47d57c7488cfc1a1934aa76726d12f4e881979c3d3747dbb88d5f38a'}]}, 'timestamp': '2026-02-23 09:32:56.251709', '_unique_id': '3bc5765445cd474684408787244f6a8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '86c57273-93f4-45bc-ab8c-5150032ab6eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.253267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a2808186-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '7488b9ed292b5f6a6af0f712ed2ad658a0f5c61a203ebb65800386e710b63119'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:32:56.253267', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a2808bb8-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.353123156, 'message_signature': '53942686608a4787b8f768d66bed00ba36cc45c52219b3b0ba1a6db6c3e3c664'}]}, 'timestamp': '2026-02-23 09:32:56.253883', '_unique_id': 'cb94a85990194bbdb98bebbba820c892'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:32:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7aec2b2b-8ae6-45c1-a5c8-3fed6884de57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:32:56.255285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a280cfba-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '61b24fe090710890b0d007468b77dee1dd3ded04371e1295c75e10046c36b22f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T09:32:56.255285', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a280da28-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10616.420105705, 'message_signature': '20ca10a6e14d59b59e9b513e0b895df4e319bb058940124da8204ab2cdce2db5'}]}, 'timestamp': '2026-02-23 09:32:56.255897', '_unique_id': '2d993f22ced94bd28e6f4720b7c53dfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:32:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:32:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:32:56 localhost systemd[1]: var-lib-containers-storage-overlay-0438ade5aeea533b00cd75095bec75fbc2b307bace4c89bb39b75d428637bcd8-merged.mount: Deactivated successfully. Feb 23 04:32:56 localhost systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully. 
Feb 23 04:32:56 localhost nova_compute[231721]: 2026-02-23 09:32:56.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:56 localhost nova_compute[231721]: 2026-02-23 09:32:56.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11513 DF PROTO=TCP SPT=48682 DPT=9102 SEQ=41584801 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF79FC70000000001030307) Feb 23 04:32:57 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. 
Feb 23 04:32:57 localhost nova_compute[231721]: 2026-02-23 09:32:57.539 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:57 localhost nova_compute[231721]: 2026-02-23 09:32:57.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:32:57 localhost nova_compute[231721]: 2026-02-23 09:32:57.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:32:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:32:57 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. 
Feb 23 04:32:57 localhost podman[249919]: 2026-02-23 09:32:57.644250353 +0000 UTC m=+0.086703949 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, build-date=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal) Feb 23 04:32:57 localhost podman[249919]: 2026-02-23 09:32:57.65711673 +0000 UTC m=+0.099570246 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, name=ubi9/ubi-minimal) Feb 23 04:32:57 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. Feb 23 04:32:57 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:32:57 localhost podman[249920]: 2026-02-23 09:32:57.750013578 +0000 UTC m=+0.191416599 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:32:57 localhost podman[249920]: 2026-02-23 09:32:57.757322184 +0000 UTC m=+0.198725265 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:32:58 localhost nova_compute[231721]: 2026-02-23 09:32:58.391 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:32:58 localhost nova_compute[231721]: 2026-02-23 09:32:58.392 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:32:58 localhost nova_compute[231721]: 2026-02-23 09:32:58.392 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:32:58 localhost nova_compute[231721]: 2026-02-23 09:32:58.392 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:32:59 localhost nova_compute[231721]: 2026-02-23 09:32:59.418 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:32:59 localhost nova_compute[231721]: 2026-02-23 09:32:59.442 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:32:59 localhost nova_compute[231721]: 2026-02-23 09:32:59.443 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:32:59 localhost nova_compute[231721]: 2026-02-23 09:32:59.443 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:59 localhost nova_compute[231721]: 2026-02-23 09:32:59.443 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic 
task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 23 04:32:59 localhost nova_compute[231721]: 2026-02-23 09:32:59.595 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:32:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:32:59 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully. Feb 23 04:32:59 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:32:59 localhost podman[249960]: 2026-02-23 09:32:59.918246467 +0000 UTC m=+0.241020376 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:32:59 localhost podman[249960]: 2026-02-23 09:32:59.983659045 +0000 UTC m=+0.306432914 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:33:00 localhost sshd[249984]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:33:00 localhost systemd-logind[759]: New session 56 of user zuul. Feb 23 04:33:00 localhost systemd[1]: Started Session 56 of User zuul. 
Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.439 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.539 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.565 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.565 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.565 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.566 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute 
resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:33:00 localhost nova_compute[231721]: 2026-02-23 09:33:00.566 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:33:00 localhost python3.9[250080]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45735 DF PROTO=TCP SPT=60906 DPT=9100 SEQ=2260014654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7AE460000000001030307) Feb 23 04:33:00 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:00 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.100 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.119 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.182 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.183 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:33:01 localhost python3.9[250212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.364 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.366 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12483MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.367 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.368 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.501 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.501 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.502 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.553 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.621 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 
04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.621 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.641 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.665 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 04:33:01 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:33:01 localhost nova_compute[231721]: 2026-02-23 09:33:01.714 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:33:01 localhost python3.9[250300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839180.8847115-3714-46584600479295/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:01 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 23 04:33:02 localhost nova_compute[231721]: 2026-02-23 09:33:02.148 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:33:02 localhost nova_compute[231721]: 2026-02-23 09:33:02.151 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:33:02 localhost nova_compute[231721]: 2026-02-23 09:33:02.169 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:33:02 localhost nova_compute[231721]: 2026-02-23 09:33:02.172 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:33:02 localhost nova_compute[231721]: 2026-02-23 09:33:02.173 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.805s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:33:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:33:02 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:33:02 localhost podman[250432]: 2026-02-23 09:33:02.66217478 +0000 UTC m=+0.095026459 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Feb 23 04:33:02 localhost podman[250432]: 2026-02-23 09:33:02.699098685 +0000 UTC m=+0.131950334 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:33:02 localhost python3.9[250433]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45736 DF PROTO=TCP SPT=60906 DPT=9100 SEQ=2260014654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7B6460000000001030307) Feb 23 04:33:03 localhost python3.9[250561]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:03 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully. 
Feb 23 04:33:04 localhost systemd[1]: var-lib-containers-storage-overlay-eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54-merged.mount: Deactivated successfully. Feb 23 04:33:04 localhost python3.9[250618]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:04 localhost systemd[1]: var-lib-containers-storage-overlay-eaf5c828e8984d86d81a6eee5a482e70c553115148192fac48b0718754776f54-merged.mount: Deactivated successfully. Feb 23 04:33:04 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:33:04 localhost nova_compute[231721]: 2026-02-23 09:33:04.632 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:04 localhost python3.9[250728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:05 localhost python3.9[250785]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.0asdu0sn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:06 localhost nova_compute[231721]: 2026-02-23 09:33:06.144 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:06 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29526 DF PROTO=TCP SPT=35476 DPT=9101 SEQ=1186504611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7C3460000000001030307) Feb 23 04:33:06 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:33:06 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 23 04:33:06 localhost python3.9[250895]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:06 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 23 04:33:07 localhost python3.9[250952]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:08 localhost python3.9[251062]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:33:08 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:08 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:33:08 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 23 04:33:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37973 DF PROTO=TCP SPT=54942 DPT=9882 SEQ=2228898416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7CF830000000001030307) Feb 23 04:33:09 localhost python3[251173]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Feb 23 04:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:33:09 localhost nova_compute[231721]: 2026-02-23 09:33:09.662 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:09 localhost systemd[1]: tmp-crun.W7nxyE.mount: Deactivated successfully. Feb 23 04:33:09 localhost podman[251191]: 2026-02-23 09:33:09.682857747 +0000 UTC m=+0.109344953 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible) Feb 23 04:33:09 localhost podman[251191]: 2026-02-23 09:33:09.693383928 +0000 UTC m=+0.119871174 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:33:09 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:10 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:33:10 localhost python3.9[251300]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:33:10 localhost python3.9[251357]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:10 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:11 localhost nova_compute[231721]: 2026-02-23 09:33:11.192 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:11 localhost python3.9[251467]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:12 localhost python3.9[251524]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:12 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37975 DF PROTO=TCP SPT=54942 DPT=9882 SEQ=2228898416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7DB860000000001030307) Feb 23 04:33:12 
localhost python3.9[251634]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:13 localhost python3.9[251691]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:13 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 23 04:33:13 localhost systemd[1]: var-lib-containers-storage-overlay-fda7ddd4426914a36b65b3677210da7055750d28e58e5eb1d0839c5cab6710a1-merged.mount: Deactivated successfully. 
Feb 23 04:33:13 localhost python3.9[251801]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:14 localhost python3.9[251858]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:14 localhost nova_compute[231721]: 2026-02-23 09:33:14.692 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:33:14 localhost podman[251951]: 2026-02-23 09:33:14.904638526 +0000 UTC m=+0.075673222 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 04:33:14 localhost podman[251951]: 2026-02-23 09:33:14.912539182 +0000 UTC m=+0.083573848 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:33:14 localhost podman[251951]: unhealthy
Feb 23 04:33:15 localhost python3.9[251979]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:33:15 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45738 DF PROTO=TCP SPT=60906 DPT=9100 SEQ=2260014654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7E6060000000001030307)
Feb 23 04:33:15 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:15 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 04:33:15 localhost python3.9[252079]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771839194.5352535-4089-178288335353392/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:33:16 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 04:33:16 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 04:33:16 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'.
Feb 23 04:33:16 localhost nova_compute[231721]: 2026-02-23 09:33:16.234 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:16 localhost python3.9[252189]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:33:17 localhost python3.9[252299]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:33:17 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:17 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:17 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:18 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29528 DF PROTO=TCP SPT=35476 DPT=9101 SEQ=1186504611 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF7F4070000000001030307)
Feb 23 04:33:18 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:18 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:19 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:19 localhost python3.9[252412]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:33:19 localhost nova_compute[231721]: 2026-02-23 09:33:19.736 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:19 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:33:19 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:19 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:20 localhost python3.9[252522]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:33:21 localhost nova_compute[231721]: 2026-02-23 09:33:21.270 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:21 localhost python3.9[252633]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 23 04:33:22 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 23 04:33:22 localhost python3.9[252745]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 23 04:33:22 localhost systemd[1]: var-lib-containers-storage-overlay-d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876-merged.mount: Deactivated successfully.
Feb 23 04:33:22 localhost systemd[1]: var-lib-containers-storage-overlay-d7608d8b6f0c641f6e65bb7bd3e1d2a7040712e7934c1516102890a576b77876-merged.mount: Deactivated successfully.
Feb 23 04:33:23 localhost python3.9[252858]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:33:23 localhost systemd[1]: session-56.scope: Deactivated successfully.
Feb 23 04:33:23 localhost systemd[1]: session-56.scope: Consumed 12.634s CPU time.
Feb 23 04:33:23 localhost systemd-logind[759]: Session 56 logged out. Waiting for processes to exit.
Feb 23 04:33:23 localhost systemd-logind[759]: Removed session 56.
Feb 23 04:33:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49492 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF809010000000001030307)
Feb 23 04:33:24 localhost sshd[252876]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:33:24 localhost nova_compute[231721]: 2026-02-23 09:33:24.779 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:24 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:24 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:33:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49493 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF80D060000000001030307)
Feb 23 04:33:25 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:33:26 localhost nova_compute[231721]: 2026-02-23 09:33:26.328 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:26 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:27 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:27 localhost sshd[252878]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:33:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49494 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF815060000000001030307)
Feb 23 04:33:27 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:33:27 localhost podman[252880]: 2026-02-23 09:33:27.917640536 +0000 UTC m=+0.090084730 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 04:33:27 localhost podman[252880]: 2026-02-23 09:33:27.932134005 +0000 UTC m=+0.104578239 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z)
Feb 23 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:28 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:28 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:33:28 localhost sshd[252900]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:33:28 localhost systemd-logind[759]: New session 57 of user zuul.
Feb 23 04:33:28 localhost systemd[1]: Started Session 57 of User zuul.
Feb 23 04:33:29 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:29 localhost python3.9[253013]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config/container-startup-config/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:33:29 localhost nova_compute[231721]: 2026-02-23 09:33:29.782 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 04:33:29 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:33:30 localhost systemd[1]: tmp-crun.77ZSRe.mount: Deactivated successfully.
Feb 23 04:33:30 localhost podman[253068]: 2026-02-23 09:33:30.023073809 +0000 UTC m=+0.095454945 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 04:33:30 localhost podman[253068]: 2026-02-23 09:33:30.037993667 +0000 UTC m=+0.110374833 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 04:33:30 localhost python3.9[253146]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:33:30 localhost python3.9[253256]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:33:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49495 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF824C60000000001030307)
Feb 23 04:33:31 localhost nova_compute[231721]: 2026-02-23 09:33:31.367 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:31 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:33:31 localhost systemd[1]: var-lib-containers-storage-overlay-e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7-merged.mount: Deactivated successfully.
Feb 23 04:33:31 localhost python3.9[253364]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:33:31 localhost systemd[1]: var-lib-containers-storage-overlay-e7e11ab6b4147a24f37e25dab2cf55bde3a4412e647a5968367e3a7c4331cac7-merged.mount: Deactivated successfully.
Feb 23 04:33:31 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 04:33:31 localhost podman[253365]: 2026-02-23 09:33:31.950242857 +0000 UTC m=+0.245837157 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller)
Feb 23 04:33:31 localhost podman[253365]: 2026-02-23 09:33:31.990265798 +0000 UTC m=+0.285860078 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 23 04:33:32 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 04:33:32 localhost python3.9[253475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839211.1572425-99-148851394594912/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:33:32 localhost systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Feb 23 04:33:32 localhost systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Feb 23 04:33:33 localhost python3.9[253583]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:33:33 localhost python3.9[253669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839212.6122553-99-114132814644498/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 23 04:33:34 localhost python3.9[253777]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:33:34 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 23 04:33:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 04:33:34 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 23 04:33:34 localhost podman[253847]: 2026-02-23 09:33:34.431113574 +0000 UTC m=+0.082869073 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Feb 23 04:33:34 localhost podman[253847]: 2026-02-23 09:33:34.441278389 +0000 UTC m=+0.093033908 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer',
'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216) Feb 23 04:33:34 localhost python3.9[253871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839213.7100668-99-105472800931003/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=840467536a035a46ebab3aa34cac1ebe80e50e31 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:33:34 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:33:34 localhost nova_compute[231721]: 2026-02-23 09:33:34.822 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:35 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. 
Feb 23 04:33:35 localhost python3.9[253991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:36 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:36 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:33:36 localhost nova_compute[231721]: 2026-02-23 09:33:36.409 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:36 localhost python3.9[254077]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839215.3850489-273-115786416178742/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=d10c6f671263070bdc94fed977552f121764373c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:33:36 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. 
Feb 23 04:33:37 localhost python3.9[254185]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:33:37 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:37 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:37 localhost python3.9[254297]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:33:37 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:38 localhost python3.9[254407]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:38 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:38 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. 
Feb 23 04:33:38 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:33:39 localhost python3.9[254464]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:33:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49496 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF846060000000001030307) Feb 23 04:33:39 localhost nova_compute[231721]: 2026-02-23 09:33:39.860 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:39 localhost python3.9[254574]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:40 localhost python3.9[254631]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:33:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:33:40 localhost systemd[1]: tmp-crun.4aYnew.mount: Deactivated successfully. Feb 23 04:33:40 localhost podman[254632]: 2026-02-23 09:33:40.918849797 +0000 UTC m=+0.098030878 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent) Feb 23 04:33:40 localhost podman[254632]: 2026-02-23 09:33:40.951330456 +0000 UTC m=+0.130511547 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:33:41 localhost systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully. Feb 23 04:33:41 localhost systemd[1]: var-lib-containers-storage-overlay-e701559fdd80af17422acb214daf2f2ee3f38cde2d9b282e59bb97f69f05cdde-merged.mount: Deactivated successfully. Feb 23 04:33:41 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:33:41 localhost nova_compute[231721]: 2026-02-23 09:33:41.452 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:41 localhost python3.9[254759]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:42 localhost python3.9[254869]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:42 localhost python3.9[254926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False 
state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:43 localhost openstack_network_exporter[245358]: ERROR 09:33:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:33:43 localhost openstack_network_exporter[245358]: Feb 23 04:33:43 localhost openstack_network_exporter[245358]: ERROR 09:33:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:33:43 localhost openstack_network_exporter[245358]: Feb 23 04:33:43 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:33:43 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 23 04:33:43 localhost python3.9[255040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:43 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. 
Feb 23 04:33:44 localhost python3.9[255097]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:44 localhost nova_compute[231721]: 2026-02-23 09:33:44.905 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:45 localhost python3.9[255207]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:33:45 localhost systemd[1]: Reloading. Feb 23 04:33:45 localhost systemd-sysv-generator[255236]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:33:45 localhost systemd-rc-local-generator[255230]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:45 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:45 localhost systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully. Feb 23 04:33:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:33:46 localhost podman[255354]: 2026-02-23 09:33:46.263889325 +0000 UTC m=+0.090285121 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:33:46 localhost podman[255354]: 2026-02-23 09:33:46.272783569 +0000 UTC m=+0.099179385 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:33:46 localhost podman[255354]: unhealthy Feb 23 04:33:46 localhost python3.9[255355]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:46 localhost nova_compute[231721]: 2026-02-23 09:33:46.503 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:46 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:46 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. Feb 23 04:33:46 localhost python3.9[255435]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:47 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully. 
Feb 23 04:33:47 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Main process exited, code=exited, status=1/FAILURE Feb 23 04:33:47 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Failed with result 'exit-code'. Feb 23 04:33:47 localhost python3.9[255545]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:33:47 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:33:47 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:47 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:33:48 localhost python3.9[255602]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:33:48.533 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:33:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:33:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:33:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:33:48.536 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:33:48 localhost python3.9[255758]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:33:48 localhost systemd[1]: Reloading. Feb 23 04:33:48 localhost systemd-rc-local-generator[255781]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:33:48 localhost systemd-sysv-generator[255786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:33:49 localhost systemd[1]: Starting Create netns directory... Feb 23 04:33:49 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Feb 23 04:33:49 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 04:33:49 localhost systemd[1]: Finished Create netns directory. 
Feb 23 04:33:49 localhost nova_compute[231721]: 2026-02-23 09:33:49.964 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:50 localhost python3.9[255910]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:33:50 localhost systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully. Feb 23 04:33:50 localhost systemd[1]: var-lib-containers-storage-overlay-32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16-merged.mount: Deactivated successfully. Feb 23 04:33:50 localhost systemd[1]: var-lib-containers-storage-overlay-32124063214ed6a71bfdb162bed59d08d2309f70899d91e1af77aee73d927f16-merged.mount: Deactivated successfully. 
Feb 23 04:33:51 localhost python3.9[256022]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:33:51 localhost nova_compute[231721]: 2026-02-23 09:33:51.550 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:33:51 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:51 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 23 04:33:51 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully. Feb 23 04:33:52 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:33:52 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:33:52 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:33:52 localhost python3.9[256149]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:33:53 localhost python3.9[256255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839232.3000507-708-67095203957197/.source.json _original_basename=.w2yffn9v follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:33:53 localhost systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 23 04:33:53 localhost systemd[1]: var-lib-containers-storage-overlay-1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437-merged.mount: Deactivated successfully.
Feb 23 04:33:53 localhost systemd[1]: var-lib-containers-storage-overlay-1a309f49ba9a6af0e1193f20a6ae2dd065eaab6a23a55dd0b287fffd33cd7437-merged.mount: Deactivated successfully.
Feb 23 04:33:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47049 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF87E300000000001030307)
Feb 23 04:33:54 localhost python3.9[256363]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:33:55 localhost nova_compute[231721]: 2026-02-23 09:33:55.010 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47050 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF882470000000001030307)
Feb 23 04:33:55 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 04:33:55 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 04:33:55 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 04:33:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49497 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF886070000000001030307)
Feb 23 04:33:56 localhost nova_compute[231721]: 2026-02-23 09:33:56.596 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:33:56 localhost python3.9[256667]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Feb 23 04:33:57 localhost nova_compute[231721]: 2026-02-23 09:33:57.174 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:57 localhost nova_compute[231721]: 2026-02-23 09:33:57.175 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:57 localhost nova_compute[231721]: 2026-02-23 09:33:57.175 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 23 04:33:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47051 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF88A460000000001030307)
Feb 23 04:33:57 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:57 localhost nova_compute[231721]: 2026-02-23 09:33:57.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:57 localhost nova_compute[231721]: 2026-02-23 09:33:57.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 23 04:33:57 localhost nova_compute[231721]: 2026-02-23 09:33:57.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 23 04:33:57 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 04:33:57 localhost python3.9[256777]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 23 04:33:57 localhost systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 23 04:33:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11517 DF PROTO=TCP SPT=48682 DPT=9102 SEQ=41584801 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF88E060000000001030307)
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.323 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.324 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.324 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.325 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 23 04:33:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:33:58 localhost systemd[1]: tmp-crun.yOUHvk.mount: Deactivated successfully.
Feb 23 04:33:58 localhost podman[256888]: 2026-02-23 09:33:58.692807751 +0000 UTC m=+0.108716020 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.openshift.expose-services=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container)
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.730 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 23 04:33:58 localhost podman[256888]: 2026-02-23 09:33:58.733478043 +0000 UTC m=+0.149386312 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_id=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc.)
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.748 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:58 localhost nova_compute[231721]: 2026-02-23 09:33:58.749 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:58 localhost python3[256887]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json containers=['neutron_sriov_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 23 04:33:58 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:58 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:59 localhost systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 23 04:33:59 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:33:59 localhost nova_compute[231721]: 2026-02-23 09:33:59.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:59 localhost nova_compute[231721]: 2026-02-23 09:33:59.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:59 localhost nova_compute[231721]: 2026-02-23 09:33:59.563 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:33:59 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 23 04:33:59 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:34:00 localhost nova_compute[231721]: 2026-02-23 09:34:00.013 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:34:00 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 23 04:34:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47052 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF89A070000000001030307)
Feb 23 04:34:01 localhost sshd[256931]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:34:01 localhost nova_compute[231721]: 2026-02-23 09:34:01.600 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:34:02 localhost systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 23 04:34:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:34:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 04:34:02 localhost podman[256934]: 2026-02-23 09:34:02.201601651 +0000 UTC m=+0.106128987 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 04:34:02 localhost podman[256934]: 2026-02-23 09:34:02.211207758 +0000 UTC m=+0.115735114 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 04:34:02 localhost podman[256933]: 2026-02-23 09:34:02.2975156 +0000 UTC m=+0.204204356 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 04:34:02 localhost podman[256933]: 2026-02-23 09:34:02.354786432 +0000 UTC m=+0.261475258 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:34:02 localhost nova_compute[231721]: 2026-02-23 09:34:02.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:34:02 localhost nova_compute[231721]: 2026-02-23 09:34:02.555 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:34:02 localhost nova_compute[231721]: 2026-02-23 09:34:02.556 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:34:02 localhost nova_compute[231721]: 2026-02-23 09:34:02.556 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:34:02 localhost nova_compute[231721]: 2026-02-23 09:34:02.556 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 23 04:34:02 localhost nova_compute[231721]: 2026-02-23 09:34:02.557 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:34:02 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 04:34:02 localhost podman[256975]:
Feb 23 04:34:02 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 04:34:02 localhost podman[256975]: 2026-02-23 09:34:02.951086463 +0000 UTC m=+0.722801850 container create 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 04:34:02 localhost podman[256975]: 2026-02-23 09:34:02.922494548 +0000 UTC m=+0.694209955 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.009 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.070 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.071 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.248 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.249 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12370MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id":
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.249 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.250 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.323 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.323 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.323 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.358 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully. Feb 23 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. Feb 23 04:34:03 localhost sshd[257042]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:34:03 localhost systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully. 
Feb 23 04:34:03 localhost python3[256887]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.809 231725 DEBUG oslo_concurrency.processutils [None 
req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.814 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.836 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.839 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:34:03 localhost nova_compute[231721]: 2026-02-23 09:34:03.839 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.589s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:34:04 localhost systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully. Feb 23 04:34:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:34:04 localhost systemd[1]: var-lib-containers-storage-overlay-72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d-merged.mount: Deactivated successfully. Feb 23 04:34:04 localhost systemd[1]: var-lib-containers-storage-overlay-72bff2249ea9ee03825bd3e8fa07150769abcfe162fde9078852b16a351c2e6d-merged.mount: Deactivated successfully. Feb 23 04:34:04 localhost podman[257059]: 2026-02-23 09:34:04.842829928 +0000 UTC m=+0.104991001 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute) Feb 23 04:34:04 localhost podman[257059]: 2026-02-23 09:34:04.879475891 +0000 UTC m=+0.141636944 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:34:05 localhost nova_compute[231721]: 2026-02-23 09:34:05.043 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:05 localhost systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully. Feb 23 04:34:05 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 23 04:34:05 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 23 04:34:05 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:34:06 localhost nova_compute[231721]: 2026-02-23 09:34:06.634 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:07 localhost systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully. Feb 23 04:34:07 localhost podman[242954]: @ - - [23/Feb/2026:09:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 143381 "" "Go-http-client/1.1" Feb 23 04:34:07 localhost podman_exporter[242941]: ts=2026-02-23T09:34:07.254Z caller=exporter.go:96 level=info msg="Listening on" address=:9882 Feb 23 04:34:07 localhost podman_exporter[242941]: ts=2026-02-23T09:34:07.255Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882 Feb 23 04:34:07 localhost podman_exporter[242941]: ts=2026-02-23T09:34:07.255Z caller=tls_config.go:316 level=info msg="TLS is disabled." 
http2=false address=[::]:9882 Feb 23 04:34:07 localhost python3.9[257198]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:34:08 localhost python3.9[257311]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:34:09 localhost python3.9[257366]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:34:09 localhost podman[242954]: time="2026-02-23T09:34:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:34:09 localhost podman[242954]: @ - - [23/Feb/2026:09:34:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147334 "" "Go-http-client/1.1" Feb 23 04:34:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47053 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8BA060000000001030307) Feb 23 04:34:09 localhost podman[242954]: @ - - [23/Feb/2026:09:34:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16227 "" "Go-http-client/1.1" Feb 23 04:34:09 localhost python3.9[257478]: ansible-copy Invoked with 
src=/home/zuul/.ansible/tmp/ansible-tmp-1771839249.0685031-942-199366317834695/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:34:10 localhost nova_compute[231721]: 2026-02-23 09:34:10.073 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:10 localhost python3.9[257533]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:34:10 localhost systemd[1]: Reloading. Feb 23 04:34:10 localhost systemd-rc-local-generator[257560]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:34:10 localhost systemd-sysv-generator[257564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost python3.9[257624]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:34:11 localhost systemd[1]: Reloading. Feb 23 04:34:11 localhost systemd-sysv-generator[257656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:34:11 localhost systemd-rc-local-generator[257653]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:34:11 localhost nova_compute[231721]: 2026-02-23 09:34:11.668 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:34:11 localhost systemd[1]: Starting neutron_sriov_agent container... 
Feb 23 04:34:11 localhost podman[257663]: 2026-02-23 09:34:11.798702621 +0000 UTC m=+0.084607479 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent) Feb 23 04:34:11 localhost 
podman[257663]: 2026-02-23 09:34:11.807225934 +0000 UTC m=+0.093130792 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:34:11 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:34:11 localhost systemd[1]: Started libcrun container. Feb 23 04:34:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 23 04:34:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:34:11 localhost podman[257670]: 2026-02-23 09:34:11.901315364 +0000 UTC m=+0.176227060 container init 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=neutron_sriov_agent, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:34:11 localhost podman[257670]: 2026-02-23 09:34:11.911114048 +0000 UTC m=+0.186025744 container start 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.43.0, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:34:11 localhost podman[257670]: neutron_sriov_agent Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: + sudo -E kolla_set_configs Feb 23 04:34:11 localhost systemd[1]: Started neutron_sriov_agent container. 
Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Validating config file Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Copying service configuration files Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Writing out command to execute Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for 
/var/lib/neutron/.cache/python-entrypoints Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy Feb 23 04:34:11 localhost neutron_sriov_agent[257694]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: ++ cat /run_command Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + CMD=/usr/bin/neutron-sriov-nic-agent Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + ARGS= Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + sudo kolla_copy_cacerts Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + [[ ! -n '' ]] Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + . 
kolla_extend_start Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: Running command: '/usr/bin/neutron-sriov-nic-agent' Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\''' Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + umask 0022 Feb 23 04:34:12 localhost neutron_sriov_agent[257694]: + exec /usr/bin/neutron-sriov-nic-agent Feb 23 04:34:13 localhost openstack_network_exporter[245358]: ERROR 09:34:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:34:13 localhost openstack_network_exporter[245358]: Feb 23 04:34:13 localhost openstack_network_exporter[245358]: ERROR 09:34:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:34:13 localhost openstack_network_exporter[245358]: Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.518 2 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 
'step_size': 1, 'reserved': 0}#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.519 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005626463.localdomain'}#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.520 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] RPC agent_id: nic-switch-agent.np0005626463.localdomain#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.524 2 INFO neutron.agent.agent_extensions_manager [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Loaded agent extensions: ['qos']#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.524 2 INFO neutron.agent.agent_extensions_manager [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Initializing agent extension 'qos'#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Agent initialized successfully, now running... 
#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m Feb 23 04:34:13 localhost neutron_sriov_agent[257694]: 2026-02-23 09:34:13.940 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-eaa4e402-ca61-4ef3-bc4c-a08552cd5290 - - - - - -] Agent out of sync with plugin!#033[00m Feb 23 04:34:15 localhost python3.9[257817]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:34:15 localhost nova_compute[231721]: 2026-02-23 09:34:15.095 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:16 localhost python3.9[257927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:16 localhost nova_compute[231721]: 2026-02-23 09:34:16.711 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:16 localhost python3.9[258017]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839255.7592146-1077-153905536064629/.source.yaml _original_basename=.ft4p9v8o follow=False checksum=b7da0e0729778df3fa2e9c064ac77c610bc22800 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:34:17 localhost podman[258035]: 2026-02-23 09:34:17.919585805 +0000 UTC m=+0.097307955 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:34:17 localhost podman[258035]: 2026-02-23 09:34:17.928121828 +0000 UTC m=+0.105843928 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:34:17 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:34:18 localhost python3.9[258150]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:34:19 localhost systemd[1]: Stopping neutron_sriov_agent container... Feb 23 04:34:19 localhost systemd[1]: libpod-8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae.scope: Deactivated successfully. Feb 23 04:34:19 localhost podman[258154]: 2026-02-23 09:34:19.106942249 +0000 UTC m=+0.074936258 container died 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 04:34:19 localhost systemd[1]: libpod-8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae.scope: Consumed 1.708s CPU time. Feb 23 04:34:19 localhost systemd[1]: tmp-crun.2E4u2c.mount: Deactivated successfully. Feb 23 04:34:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae-userdata-shm.mount: Deactivated successfully. Feb 23 04:34:19 localhost podman[258154]: 2026-02-23 09:34:19.172994103 +0000 UTC m=+0.140988102 container cleanup 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=neutron_sriov_agent, managed_by=edpm_ansible) Feb 23 04:34:19 localhost podman[258154]: neutron_sriov_agent Feb 23 04:34:19 localhost podman[258180]: 2026-02-23 09:34:19.26259455 +0000 UTC m=+0.053898465 container cleanup 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_id=neutron_sriov_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 04:34:19 localhost podman[258180]: 
neutron_sriov_agent Feb 23 04:34:19 localhost systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully. Feb 23 04:34:19 localhost systemd[1]: Stopped neutron_sriov_agent container. Feb 23 04:34:19 localhost systemd[1]: Starting neutron_sriov_agent container... Feb 23 04:34:19 localhost systemd[1]: Started libcrun container. Feb 23 04:34:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 23 04:34:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/51924fe3f0717b37533a02cc6477f7f6558c10a39f81b159d32c6a1d48606037/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:34:19 localhost podman[258192]: 2026-02-23 09:34:19.397432815 +0000 UTC m=+0.106237161 container init 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}) Feb 23 04:34:19 localhost podman[258192]: 2026-02-23 09:34:19.406234456 +0000 UTC m=+0.115038802 container start 8f2ea6310cd353f33c1478e7503e6f9c52ea7620eec6612361cd1c39bc0392ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, container_name=neutron_sriov_agent, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-b5f612145a1fc71d65d885476e8573b292a256521aff746056c2bc56d98839aa'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:34:19 localhost podman[258192]: neutron_sriov_agent Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + sudo -E kolla_set_configs Feb 23 04:34:19 localhost 
systemd[1]: Started neutron_sriov_agent container. Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Validating config file Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Copying service configuration files Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Writing out command to execute Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Feb 23 04:34:19 localhost 
neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29 Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c4d69ddbf6f4a149b7e6d31d28f2dc1fe1c08d98a601f027e6d63209aefe8011 Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: ++ cat /run_command Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + CMD=/usr/bin/neutron-sriov-nic-agent Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + ARGS= Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + sudo kolla_copy_cacerts Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + [[ ! -n '' ]] Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + . 
kolla_extend_start Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: Running command: '/usr/bin/neutron-sriov-nic-agent' Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\''' Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + umask 0022 Feb 23 04:34:19 localhost neutron_sriov_agent[258207]: + exec /usr/bin/neutron-sriov-nic-agent Feb 23 04:34:20 localhost nova_compute[231721]: 2026-02-23 09:34:20.128 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:20 localhost systemd-logind[759]: Session 57 logged out. Waiting for processes to exit. Feb 23 04:34:20 localhost systemd[1]: session-57.scope: Deactivated successfully. Feb 23 04:34:20 localhost systemd[1]: session-57.scope: Consumed 22.603s CPU time. Feb 23 04:34:20 localhost systemd-logind[759]: Removed session 57. Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.011 2 INFO neutron.common.config [-] Logging enabled!#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.011 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO 
neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.012 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005626463.localdomain'}#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.013 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] RPC agent_id: nic-switch-agent.np0005626463.localdomain#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.017 2 INFO neutron.agent.agent_extensions_manager [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Loaded agent extensions: ['qos']#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.017 2 INFO neutron.agent.agent_extensions_manager [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Initializing agent extension 'qos'#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.143 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Agent initialized successfully, now running... 
#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.144 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] SRIOV NIC Agent RPC Daemon Started!#033[00m Feb 23 04:34:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:34:21.144 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-3d259088-55cf-41f4-aa43-82356d20e4c0 - - - - - -] Agent out of sync with plugin!#033[00m Feb 23 04:34:21 localhost nova_compute[231721]: 2026-02-23 09:34:21.746 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11435 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8F3600000000001030307) Feb 23 04:34:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11436 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8F7860000000001030307) Feb 23 04:34:25 localhost nova_compute[231721]: 2026-02-23 09:34:25.159 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47054 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8FA060000000001030307) Feb 23 04:34:26 localhost nova_compute[231721]: 2026-02-23 
09:34:26.785 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:26 localhost sshd[258240]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:34:27 localhost systemd-logind[759]: New session 58 of user zuul. Feb 23 04:34:27 localhost systemd[1]: Started Session 58 of User zuul. Feb 23 04:34:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11437 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF8FF860000000001030307) Feb 23 04:34:28 localhost python3.9[258351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:34:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49498 DF PROTO=TCP SPT=41760 DPT=9102 SEQ=1430148475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF904070000000001030307) Feb 23 04:34:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:34:29 localhost podman[258466]: 2026-02-23 09:34:29.228305706 +0000 UTC m=+0.085836037 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1770267347, io.buildah.version=1.33.7, version=9.7) Feb 23 04:34:29 localhost podman[258466]: 2026-02-23 09:34:29.245236108 +0000 UTC m=+0.102766459 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, release=1770267347, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., 
container_name=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:34:29 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:34:29 localhost python3.9[258465]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:34:30 localhost nova_compute[231721]: 2026-02-23 09:34:30.162 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:30 localhost python3.9[258547]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:34:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11438 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF90F460000000001030307) Feb 23 04:34:31 localhost nova_compute[231721]: 2026-02-23 09:34:31.826 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:33 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:34:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:34:33 localhost systemd[1]: tmp-crun.fwAAHu.mount: Deactivated successfully. Feb 23 04:34:33 localhost podman[258550]: 2026-02-23 09:34:33.922921291 +0000 UTC m=+0.094563787 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:34:33 localhost podman[258551]: 2026-02-23 09:34:33.986281889 
+0000 UTC m=+0.157802761 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:34:34 localhost podman[258551]: 2026-02-23 09:34:33.999536133 +0000 UTC m=+0.171057015 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:34:34 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:34:34 localhost podman[258550]: 2026-02-23 09:34:34.048378516 +0000 UTC m=+0.220020942 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:34:34 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:34:35 localhost python3.9[258706]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Feb 23 04:34:35 localhost nova_compute[231721]: 2026-02-23 09:34:35.194 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:34:36 localhost nova_compute[231721]: 2026-02-23 09:34:36.871 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:36 localhost podman[258786]: 2026-02-23 09:34:36.933852738 +0000 UTC m=+0.107906606 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Feb 23 04:34:36 localhost podman[258786]: 2026-02-23 09:34:36.94922639 +0000 UTC m=+0.123280268 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 04:34:36 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:34:37 localhost python3.9[258838]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/container-startup-config setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:37 localhost python3.9[258948]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:38 localhost sshd[259024]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:34:38 localhost python3.9[259060]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:38 localhost python3.9[259170]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:39 localhost podman[242954]: 
time="2026-02-23T09:34:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:34:39 localhost podman[242954]: @ - - [23/Feb/2026:09:34:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147333 "" "Go-http-client/1.1" Feb 23 04:34:39 localhost podman[242954]: @ - - [23/Feb/2026:09:34:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16352 "" "Go-http-client/1.1" Feb 23 04:34:39 localhost python3.9[259280]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11439 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF930060000000001030307) Feb 23 04:34:40 localhost python3.9[259390]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:40 localhost nova_compute[231721]: 2026-02-23 09:34:40.228 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:40 localhost python3.9[259500]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:41 localhost python3.9[259610]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:41 localhost nova_compute[231721]: 2026-02-23 09:34:41.906 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:34:42 localhost systemd[1]: tmp-crun.0KUEqe.mount: Deactivated successfully. 
Feb 23 04:34:42 localhost podman[259699]: 2026-02-23 09:34:42.292829225 +0000 UTC m=+0.086283602 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 04:34:42 localhost 
podman[259699]: 2026-02-23 09:34:42.327238201 +0000 UTC m=+0.120692609 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:34:42 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:34:42 localhost python3.9[259698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839281.0231736-273-176270633770242/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=472c5e922ae22c8bdcaef73d1ca73ce5597b440e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:43 localhost python3.9[259824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:43 localhost openstack_network_exporter[245358]: ERROR 09:34:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:34:43 localhost openstack_network_exporter[245358]: Feb 23 04:34:43 localhost openstack_network_exporter[245358]: ERROR 09:34:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:34:43 localhost openstack_network_exporter[245358]: Feb 23 04:34:43 localhost python3.9[259910]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839282.5792007-318-270059151978062/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None 
selevel=None attributes=None Feb 23 04:34:43 localhost sshd[260020]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:34:44 localhost python3.9[260019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:44 localhost python3.9[260107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839283.673694-318-82219066607471/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:45 localhost python3.9[260215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:45 localhost nova_compute[231721]: 2026-02-23 09:34:45.276 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:45 localhost python3.9[260301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839284.675606-318-142287225081878/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=cb6fb0641ad99e101e98bdc096471f5e2f31a05c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:46 localhost python3.9[260409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:46 localhost nova_compute[231721]: 2026-02-23 09:34:46.937 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:47 localhost python3.9[260495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839286.3943536-492-61306009228428/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=d10c6f671263070bdc94fed977552f121764373c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:47 localhost python3.9[260603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:48 localhost python3.9[260689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839287.510439-537-72688536006185/.source follow=False _original_basename=haproxy.j2 checksum=eddfecb822bb60e7241db0fd719c7552d2d25452 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None 
attributes=None Feb 23 04:34:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:34:48.534 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:34:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:34:48.535 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:34:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:34:48.536 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:34:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:34:48 localhost podman[260778]: 2026-02-23 09:34:48.90964953 +0000 UTC m=+0.080933657 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:34:48 localhost podman[260778]: 2026-02-23 09:34:48.917913184 +0000 UTC m=+0.089197331 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:34:48 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:34:49 localhost python3.9[260808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:49 localhost python3.9[260906]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839288.6223347-537-565126377649/.source follow=False _original_basename=dnsmasq.j2 checksum=a6b8b2fb47e7419d250eaee9e3565b13fff8f42e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:50 localhost python3.9[261014]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:50 localhost nova_compute[231721]: 2026-02-23 09:34:50.315 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:50 localhost python3.9[261069]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:51 localhost python3.9[261177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:51 localhost python3.9[261263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839290.8308282-624-65563057521069/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:51 localhost nova_compute[231721]: 2026-02-23 09:34:51.976 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:52 localhost python3.9[261371]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:34:53 localhost python3.9[261562]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:53 localhost podman[261591]: 2026-02-23 09:34:53.781230774 +0000 UTC m=+0.085955562 
container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, release=1770267347, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64) Feb 23 04:34:53 localhost podman[261591]: 2026-02-23 09:34:53.897157624 +0000 UTC m=+0.201882352 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux , ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:34:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1607 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF968900000000001030307) Feb 23 04:34:54 localhost sshd[261802]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:34:54 localhost python3.9[261801]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:55 localhost python3.9[261877]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None attributes=None Feb 23 04:34:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1608 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF96C860000000001030307) Feb 23 04:34:55 localhost nova_compute[231721]: 2026-02-23 09:34:55.361 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:55 localhost python3.9[262019]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11440 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF970060000000001030307) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} 
discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.131 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 55800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adc72d6a-d082-41ee-a29b-2423f69e5a9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 55800000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:34:56.132006', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e9f77934-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.340275174, 'message_signature': '46c3c9c904b885d8401956e1a210f6b16501c6c05b7e8d6a9f58953d5827b628'}]}, 'timestamp': '2026-02-23 09:34:56.151659', '_unique_id': 'fa9020d37890426db489d6d4a1fc1931'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in 
_connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 
04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.153 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.156 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1c8491be-13c9-4312-9787-aac2c47f8203', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.154704', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'e9f85c8c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '91a00fe089596d72b867cd20a1d20b97d4c614ddf705adae92364e28c1fe08af'}]}, 'timestamp': '2026-02-23 09:34:56.157455', '_unique_id': 'ea83552615cc4ec986599488f5837f41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.158 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.159 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52a8fafe-df1e-45c2-8ce1-fafc1ee93355', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.159794', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'e9f8cd2a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'dcde5105fe79e4a6b3e1eaf5c5352c4b400cf3a2da9026eccd550339061a6e28'}]}, 'timestamp': '2026-02-23 09:34:56.160320', '_unique_id': '345e726edcd546488f54236a880aadbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:34:56.161 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.161 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.162 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7b06f8c7-c1e0-4e2f-8ed9-dad6d67629c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.162551', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'e9f9377e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'c8e4bfa84e008e7e9530c9429af1d47a8e54b50c4604a6a89463b0de3ef48c18'}]}, 'timestamp': '2026-02-23 09:34:56.163071', '_unique_id': 'f952036d854647e8b347a62d135b460e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.164 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:34:56 localhost python3.9[262077]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9aea7f6e-f5f0-4637-ba6c-27e7f40e07f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.165399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e9fe9750-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '3c96611165d55c822a12bc9d5003ec80689bf5bd55ddbd7f376fab9e7d1f404f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.165399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e9feabbe-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '416e9daaae04fb16c0451852af84685f0672d41e4c28f5f9275bcde8c6640e22'}]}, 'timestamp': '2026-02-23 09:34:56.198777', '_unique_id': 'b200d0f07d2a413e8e22c190a45b6b8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.200 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.202 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73d5e210-0a91-48ef-a34d-6b3c6d05acfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.201633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e9ff2f62-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '682b36a2ad1bd8bd598ca62f2ab4b61ca74174613bf1680c190d7a0067443653'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.201633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e9ff40f6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': 'f35e2be863eee16b5e109164f430d135c119729b535984482730951b1322d85f'}]}, 'timestamp': '2026-02-23 09:34:56.202581', '_unique_id': '648f54a974154782b58680c796048655'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d19a4dc-f6f8-4dbb-9f89-9be247e8ee32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.205305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e9ffbd9c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '3e45c434dbed7d7541801a695e619509e3c5a86c2a359125b5fc7b1872f52da2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.205305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e9ffd052-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': 'b741c3aa70c6fb4e4213053d2110c182dc64797f611b46e3081ee0b6abc2d5f2'}]}, 'timestamp': '2026-02-23 09:34:56.206255', '_unique_id': '0c842361abe04d4e96b8fb919eed6586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.207 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e193fd5-2dbd-492f-9c89-68d3257a9c60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.208603', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea003e8e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'f0ba8526d9f1d00442adce0a3f388a780b4549d64745e2b81595badc71e049c8'}]}, 'timestamp': '2026-02-23 09:34:56.209173', '_unique_id': '72d61fd59d3447cf8231496d15f1b2b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR 
oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.210 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.211 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df6079ee-084c-4846-9c4e-9cce4dd02e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.211443', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea00ad42-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '462196be923fa0169a7eae174d41ab92b795eedc2585593a0e2f0f623b44705e'}]}, 'timestamp': '2026-02-23 09:34:56.211972', '_unique_id': '2fce48a7a2ca452e96ad2143af2cb246'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.212 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.214 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e90603f3-96af-4b40-b8aa-c0af4d413cb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9216, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.214248', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea011aca-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'd9a9e30b09db5b5daf32d195d74c86a4fc708bb36602b59907101d6387a1dcf6'}]}, 'timestamp': '2026-02-23 09:34:56.214745', '_unique_id': '4c47e6387cda4d29acabd13fe610e61b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.215 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '01468b17-e61c-42a1-9ab0-70377e217073', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.217548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea019bb2-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '814b3d2e210177cef4c5e26e9dc2e27c525cf0dfec95a24975b79c7169c7200b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.217548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea01ae5e-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '49d6354e987d0bada8fded9803d0770bce939104ce3dfc0d08a8d4d4acf43f28'}]}, 'timestamp': '2026-02-23 09:34:56.218494', '_unique_id': 'a4a86eb6dd564fa2bc534dff2cbd9e8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.219 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '427303af-2c21-46d4-9c77-0a7b8f1b2c31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.220756', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea021ba0-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '8d37f606d015c31749718bb6f9f35c465eaafbcc0808be3467cedd50bbe45591'}]}, 'timestamp': '2026-02-23 09:34:56.221320', '_unique_id': '34189cd17bdb4795b7be01ac3e13bb38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '800c2297-7595-48ed-bf71-b8cb7af2a64f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:34:56.223520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ea0284b4-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.340275174, 'message_signature': 'fe3d294713f1b9ebcb1d1b82c0bb8288efa42172cdcc5696317a081171cf7cc7'}]}, 'timestamp': '2026-02-23 09:34:56.224044', '_unique_id': '06657e34ea1846988925282f76d60184'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.225 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2509cf08-d092-43dc-aaf6-f2ebf654c724', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.226428', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea02f656-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '67b7a2106eb29cf9b06fcb635174ba201ff977722dc28b5e999d8338e472d6f4'}]}, 'timestamp': '2026-02-23 09:34:56.226951', '_unique_id': 'c8b9f3fd642e4b77af7166bff6412f01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.227 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1d17d67-692a-41b4-b157-caeb5fd7d07a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.229186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea051d0a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'f5340bfc36bc0fd585598d311a74a7a6fb70910318f1054121483e7bb69e5e95'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.229186', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea052980-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'badb59891334d73b82d573ea992639b614302e99b1350cca1efcf768dbc2bb5d'}]}, 'timestamp': '2026-02-23 09:34:56.241217', '_unique_id': '441e7112ccd34d5d98790c9a2878bd7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:34:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca41c10c-87f8-484f-b7eb-0854c52abe00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.242656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea056c6a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '531bba6cd0ca96f5def37bd38c05e4520a543c883be8b458c1d891c3d967fe0c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.242656', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea057796-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '0bafd189d51526a1ba82ac0b686ff0408639be521d3e80ccc93a0d6a19bc2a8a'}]}, 'timestamp': '2026-02-23 09:34:56.243213', '_unique_id': '568c42c9c28e478b92f6389c52cb31a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49e0ed73-b464-48d4-bade-5a8c76d99dcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.244611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea05b85a-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 
'bccc1fad628d016cb1e99b1a461d4b2a77e5eca6718ede932ae815260ed03a39'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.244611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea05c3cc-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': '032df660b0a71ed0d1725fe82df0f873acd44155988eebfafff607f9603c157e'}]}, 'timestamp': '2026-02-23 09:34:56.245164', '_unique_id': '73853cad79ef4949899837acae2bf228'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:34:56.245 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '220027b5-b5e9-4298-b7a6-00a7cfd6977c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.246472', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea060148-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': 'c1b1184af8a56c8b061abc614ccd34f593b024b137fda56673c74ffc1a027f9d'}]}, 'timestamp': '2026-02-23 09:34:56.246755', '_unique_id': '798dde2eb9aa448fbaf1138c8dd4b0cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53c6fe9b-554d-4c28-b317-409f4852a9e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.248056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea063f00-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': '9716e99ec14b09d009ad5f3a5f66d37c1c69b54b9abc404280528dc3f87214d8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.248056', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea06489c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.354886762, 'message_signature': 'f0038ed9b882beb0155675ba0dff0cbc1c548efba7711e8d93da2dafff7b56b9'}]}, 'timestamp': '2026-02-23 09:34:56.248561', '_unique_id': '414bd1202e1b4d23acb9e6586b4b614b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cfbee12-84d5-4d4f-bbd4-b932816d1a0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:34:56.250058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ea069324-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': '9570e4604484cf4cf85e2b5dd500fe534c16032b8e31022354f10e28a6402f73'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:34:56.250058', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ea069d4c-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.418696643, 'message_signature': 'e9b6ec69729fd7f1234a7ee677dcc4ab3f67c97b07d549af26de9711e9a9f275'}]}, 'timestamp': '2026-02-23 09:34:56.250733', '_unique_id': '5c5ddf8711e44c08a34188f244626d4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:34:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.252 12 DEBUG ceilometer.compute.pollsters [-] 
c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9a92e77-d891-4aa3-b39b-aade3e2bd94f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:34:56.252139', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'ea06dea6-109a-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10736.344198894, 'message_signature': '76a987dcc1cecdac7c68a68adf4833f335296709573f0d4229fd05978677584c'}]}, 'timestamp': '2026-02-23 
09:34:56.252423', '_unique_id': 'ed4b0e4673294ac7a8019f9bcd0a79af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR 
oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:34:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:34:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:34:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:34:56 localhost python3.9[262187]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:34:57 localhost nova_compute[231721]: 2026-02-23 09:34:57.018 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:34:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1609 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF974860000000001030307) Feb 23 04:34:57 localhost python3.9[262297]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:57 localhost nova_compute[231721]: 2026-02-23 09:34:57.840 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:34:57 localhost nova_compute[231721]: 2026-02-23 09:34:57.841 231725 DEBUG oslo_service.periodic_task 
[None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:34:57 localhost nova_compute[231721]: 2026-02-23 09:34:57.841 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:34:57 localhost python3.9[262354]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:34:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47055 DF PROTO=TCP SPT=40542 DPT=9102 SEQ=3702321504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF978060000000001030307) Feb 23 04:34:58 localhost nova_compute[231721]: 2026-02-23 09:34:58.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:34:58 localhost nova_compute[231721]: 2026-02-23 09:34:58.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:34:58 localhost nova_compute[231721]: 2026-02-23 09:34:58.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:34:59 localhost python3.9[262464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:34:59 localhost nova_compute[231721]: 2026-02-23 09:34:59.356 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:34:59 localhost nova_compute[231721]: 2026-02-23 09:34:59.356 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:34:59 localhost nova_compute[231721]: 2026-02-23 09:34:59.356 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:34:59 localhost nova_compute[231721]: 2026-02-23 09:34:59.357 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on 
Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:34:59 localhost systemd[1]: tmp-crun.f7p3VO.mount: Deactivated successfully. Feb 23 04:34:59 localhost podman[262522]: 2026-02-23 09:34:59.443386243 +0000 UTC m=+0.089033447 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal) Feb 23 04:34:59 localhost podman[262522]: 2026-02-23 09:34:59.457038042 +0000 UTC m=+0.102685216 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, 
io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, version=9.7, maintainer=Red Hat, Inc.) Feb 23 04:34:59 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:34:59 localhost python3.9[262521]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:00 localhost nova_compute[231721]: 2026-02-23 09:35:00.361 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:00 localhost nova_compute[231721]: 2026-02-23 09:35:00.456 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:35:00 localhost nova_compute[231721]: 2026-02-23 09:35:00.478 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:35:00 localhost nova_compute[231721]: 2026-02-23 09:35:00.479 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:35:00 localhost nova_compute[231721]: 2026-02-23 09:35:00.480 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:00 localhost nova_compute[231721]: 2026-02-23 09:35:00.480 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:00 localhost 
nova_compute[231721]: 2026-02-23 09:35:00.480 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:01 localhost python3.9[262651]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:35:01 localhost systemd[1]: Reloading. Feb 23 04:35:01 localhost systemd-rc-local-generator[262679]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:35:01 localhost systemd-sysv-generator[262682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:35:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1610 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF984460000000001030307) Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses 
MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:01 localhost nova_compute[231721]: 2026-02-23 09:35:01.476 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:01 localhost nova_compute[231721]: 2026-02-23 09:35:01.539 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:02 localhost python3.9[262799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:35:02 localhost nova_compute[231721]: 2026-02-23 09:35:02.068 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:02 localhost python3.9[262856]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service 
_original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:03 localhost python3.9[262966]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:35:03 localhost nova_compute[231721]: 2026-02-23 09:35:03.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:03 localhost nova_compute[231721]: 2026-02-23 09:35:03.566 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:35:03 localhost nova_compute[231721]: 2026-02-23 09:35:03.567 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:35:03 localhost nova_compute[231721]: 2026-02-23 09:35:03.567 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" 
:: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:35:03 localhost nova_compute[231721]: 2026-02-23 09:35:03.568 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:35:03 localhost nova_compute[231721]: 2026-02-23 09:35:03.568 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:35:03 localhost python3.9[263023]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.016 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.088 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.089 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:35:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:35:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.326 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.328 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12244MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.328 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.329 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:35:04 localhost systemd[1]: tmp-crun.L6KTKV.mount: Deactivated successfully. Feb 23 04:35:04 localhost podman[263155]: 2026-02-23 09:35:04.393046886 +0000 UTC m=+0.094643178 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.413 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - 
-] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.414 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.414 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.456 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:35:04 localhost podman[263157]: 2026-02-23 09:35:04.464089608 +0000 UTC m=+0.164984439 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', 
'--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:35:04 localhost podman[263157]: 2026-02-23 09:35:04.476208811 +0000 UTC m=+0.177103632 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': 
{'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:35:04 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:35:04 localhost podman[263155]: 2026-02-23 09:35:04.499328321 +0000 UTC m=+0.200924573 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:35:04 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:35:04 localhost python3.9[263156]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:35:04 localhost systemd[1]: Reloading. Feb 23 04:35:04 localhost systemd-rc-local-generator[263249]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:35:04 localhost systemd-sysv-generator[263252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:04 localhost systemd[1]: Starting Create netns directory... Feb 23 04:35:04 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Feb 23 04:35:04 localhost systemd[1]: Finished Create netns directory. Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.957 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.964 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.982 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.985 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:35:04 localhost nova_compute[231721]: 2026-02-23 09:35:04.986 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:35:05 localhost nova_compute[231721]: 2026-02-23 09:35:05.363 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:05 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. 
Feb 23 04:35:06 localhost python3.9[263375]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:07 localhost nova_compute[231721]: 2026-02-23 09:35:07.111 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:35:07 localhost systemd[1]: tmp-crun.MlsUM6.mount: Deactivated successfully. Feb 23 04:35:07 localhost podman[263486]: 2026-02-23 09:35:07.488691838 +0000 UTC m=+0.097180517 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck 
compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:35:07 localhost podman[263486]: 2026-02-23 09:35:07.502278345 +0000 UTC m=+0.110767024 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:35:07 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:35:07 localhost python3.9[263485]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:35:08 localhost python3.9[263613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:35:08 localhost python3.9[263701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839307.8117497-1092-18518264836399/.source.json _original_basename=.w8nqsbxi follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1611 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9A4060000000001030307) Feb 23 04:35:09 localhost podman[242954]: time="2026-02-23T09:35:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:35:09 localhost podman[242954]: @ - - [23/Feb/2026:09:35:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147333 "" 
"Go-http-client/1.1" Feb 23 04:35:09 localhost podman[242954]: @ - - [23/Feb/2026:09:35:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16348 "" "Go-http-client/1.1" Feb 23 04:35:09 localhost python3.9[263809]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:10 localhost nova_compute[231721]: 2026-02-23 09:35:10.382 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:12 localhost nova_compute[231721]: 2026-02-23 09:35:12.162 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:35:12 localhost podman[264113]: 2026-02-23 09:35:12.90872267 +0000 UTC m=+0.083973521 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0) Feb 23 04:35:12 localhost 
podman[264113]: 2026-02-23 09:35:12.91491262 +0000 UTC m=+0.090163501 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Feb 23 04:35:12 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:35:13 localhost python3.9[264119]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False Feb 23 04:35:13 localhost openstack_network_exporter[245358]: ERROR 09:35:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:35:13 localhost openstack_network_exporter[245358]: Feb 23 04:35:13 localhost openstack_network_exporter[245358]: ERROR 09:35:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:35:13 localhost openstack_network_exporter[245358]: Feb 23 04:35:15 localhost python3.9[264240]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:35:15 localhost nova_compute[231721]: 2026-02-23 09:35:15.416 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:16 localhost sshd[264351]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:35:16 localhost python3[264350]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:35:16 localhost podman[264389]: Feb 23 04:35:16 localhost podman[264389]: 2026-02-23 09:35:16.654860193 +0000 UTC m=+0.083482526 container create 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, 
org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 23 04:35:16 localhost podman[264389]: 2026-02-23 09:35:16.610531011 +0000 UTC m=+0.039153374 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:35:16 localhost python3[264350]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label 
managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:35:17 localhost nova_compute[231721]: 2026-02-23 09:35:17.208 231725 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:17 localhost python3.9[264537]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:35:18 localhost python3.9[264649]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:18 localhost sshd[264705]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:35:18 localhost python3.9[264704]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:35:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:35:19 localhost systemd[1]: tmp-crun.GIdblx.mount: Deactivated successfully. 
Feb 23 04:35:19 localhost podman[264761]: 2026-02-23 09:35:19.133414569 +0000 UTC m=+0.092620816 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:35:19 localhost podman[264761]: 2026-02-23 09:35:19.168329201 +0000 UTC m=+0.127535468 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:35:19 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:35:19 localhost python3.9[264838]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771839318.806457-1326-216739614844479/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:35:19 localhost python3.9[264893]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:35:20 localhost systemd[1]: Reloading. Feb 23 04:35:20 localhost systemd-rc-local-generator[264919]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:35:20 localhost systemd-sysv-generator[264922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:20 localhost nova_compute[231721]: 2026-02-23 09:35:20.448 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:20 localhost sshd[264985]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:35:21 localhost python3.9[264984]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:35:21 localhost systemd[1]: Reloading. 
Feb 23 04:35:21 localhost systemd-rc-local-generator[265011]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:35:21 localhost systemd-sysv-generator[265017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:35:21 localhost systemd[1]: Starting neutron_dhcp_agent container... Feb 23 04:35:21 localhost systemd[1]: Started libcrun container. 
Feb 23 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Feb 23 04:35:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:35:21 localhost podman[265027]: 2026-02-23 09:35:21.589719672 +0000 UTC m=+0.130415948 container init 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=neutron_dhcp) Feb 23 04:35:21 localhost podman[265027]: 2026-02-23 09:35:21.605755924 +0000 UTC m=+0.146452210 container start 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=neutron_dhcp, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, 
io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:35:21 localhost podman[265027]: neutron_dhcp_agent Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + sudo -E kolla_set_configs Feb 23 04:35:21 localhost systemd[1]: Started neutron_dhcp_agent container. Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Validating config file Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Copying service configuration files Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Writing out command to execute Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/external Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for 
/var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c4d69ddbf6f4a149b7e6d31d28f2dc1fe1c08d98a601f027e6d63209aefe8011
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: ++ cat /run_command
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + ARGS=
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + sudo kolla_copy_cacerts
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + [[ ! -n '' ]]
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + . kolla_extend_start
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + umask 0022
Feb 23 04:35:21 localhost neutron_dhcp_agent[265040]: + exec /usr/bin/neutron-dhcp-agent
Feb 23 04:35:22 localhost nova_compute[231721]: 2026-02-23 09:35:22.247 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:35:22 localhost python3.9[265164]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 23 04:35:22 localhost systemd[1]: tmp-crun.VRZ4Vs.mount: Deactivated successfully.
Feb 23 04:35:22 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:22.879 265044 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb 23 04:35:22 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:22.879 265044 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44#033[00m
Feb 23 04:35:23 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.241 265044 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m
Feb 23 04:35:23 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.472 265044 INFO neutron.agent.dhcp.agent [None req-2b0d4c40-8213-4918-933a-7a481075b885 - - - - - -] All active networks have been fetched through RPC.#033[00m
Feb 23 04:35:23 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.472 265044 INFO neutron.agent.dhcp.agent [None req-2b0d4c40-8213-4918-933a-7a481075b885 - - - - - -] Synchronizing state complete#033[00m
Feb 23 04:35:23 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:23.546 265044 INFO neutron.agent.dhcp.agent [None req-2b0d4c40-8213-4918-933a-7a481075b885 - - - - - -] DHCP agent started#033[00m
Feb 23 04:35:23 localhost python3.9[265275]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 23 04:35:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24706 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9DDC00000000001030307)
Feb 23 04:35:24 localhost python3.9[265365]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839323.1285071-1461-223820587238363/.source.yaml _original_basename=.v8c6l_mw follow=False checksum=032f1f7e8199faa0c01f5405a803b0de94087c3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 23 04:35:24 localhost ovn_metadata_agent[163567]: 2026-02-23 09:35:24.288 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 04:35:24 localhost ovn_metadata_agent[163567]: 2026-02-23 09:35:24.289 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb 23 04:35:24 localhost ovn_metadata_agent[163567]: 2026-02-23 09:35:24.291 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 23 04:35:24 localhost nova_compute[231721]: 2026-02-23 09:35:24.325 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:35:24 localhost python3.9[265475]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 23 04:35:25 localhost systemd[1]: Stopping neutron_dhcp_agent container...
Feb 23 04:35:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24707 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9E1C60000000001030307)
Feb 23 04:35:25 localhost nova_compute[231721]: 2026-02-23 09:35:25.486 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:35:25 localhost neutron_dhcp_agent[265040]: 2026-02-23 09:35:25.620 265044 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m
Feb 23 04:35:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1612 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9E4060000000001030307)
Feb 23 04:35:25 localhost systemd[1]: libpod-1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8.scope: Deactivated successfully.
Feb 23 04:35:25 localhost systemd[1]: libpod-1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8.scope: Consumed 1.973s CPU time.
Feb 23 04:35:25 localhost podman[265479]: 2026-02-23 09:35:25.936937598 +0000 UTC m=+0.890465724 container died 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_dhcp, org.label-schema.license=GPLv2, tcib_managed=true, container_name=neutron_dhcp_agent)
Feb 23 04:35:25 localhost podman[265479]: 2026-02-23 09:35:25.988477312 +0000 UTC m=+0.942005338 container cleanup 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=neutron_dhcp_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Feb 23 04:35:25 localhost podman[265479]: neutron_dhcp_agent
Feb 23 04:35:26 localhost podman[265519]: error opening file `/run/crun/1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8/status`: No such file or directory
Feb 23 04:35:26 localhost podman[265507]: 2026-02-23 09:35:26.086921406 +0000 UTC m=+0.068602839 container cleanup 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 04:35:26 localhost podman[265507]: neutron_dhcp_agent
Feb 23 04:35:26 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Feb 23 04:35:26 localhost systemd[1]: Stopped neutron_dhcp_agent container.
Feb 23 04:35:26 localhost systemd[1]: Starting neutron_dhcp_agent container...
Feb 23 04:35:26 localhost systemd[1]: var-lib-containers-storage-overlay-e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c-merged.mount: Deactivated successfully.
Feb 23 04:35:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8-userdata-shm.mount: Deactivated successfully.
Feb 23 04:35:26 localhost systemd[1]: Started libcrun container.
Feb 23 04:35:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 23 04:35:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f78b67cd40916655edbb88a4af3cf02793db12429e4ca7ee55e69e47b3b14c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:35:26 localhost podman[265521]: 2026-02-23 09:35:26.24233993 +0000 UTC m=+0.121380589 container init 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=neutron_dhcp, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 04:35:26 localhost podman[265521]: 2026-02-23 09:35:26.251096579 +0000 UTC m=+0.130137238 container start 1b2c23eafd230afc8d091ee7b7cedbb3afbedca0796f62e8cc47ca513d5981f8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-1f52e72ae9640a6a5ab74adc25dc50b25fb3048e41c16c538b8ad4759be27c31'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:35:26 localhost podman[265521]: neutron_dhcp_agent
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + sudo -E kolla_set_configs
Feb 23 04:35:26 localhost systemd[1]: Started neutron_dhcp_agent container.
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Validating config file
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Copying service configuration files
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Writing out command to execute
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/c4d69ddbf6f4a149b7e6d31d28f2dc1fe1c08d98a601f027e6d63209aefe8011
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.pid.haproxy
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/9da5b53d-3184-450f-9a5b-bdba1a6c9f6d.conf
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: ++ cat /run_command
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + ARGS=
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + sudo kolla_copy_cacerts
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + [[ ! -n '' ]]
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + . kolla_extend_start
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + umask 0022
Feb 23 04:35:26 localhost neutron_dhcp_agent[265537]: + exec /usr/bin/neutron-dhcp-agent
Feb 23 04:35:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24708 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9E9C60000000001030307)
Feb 23 04:35:27 localhost systemd-logind[759]: Session 58 logged out. Waiting for processes to exit.
Feb 23 04:35:27 localhost systemd[1]: session-58.scope: Deactivated successfully.
Feb 23 04:35:27 localhost systemd[1]: session-58.scope: Consumed 34.994s CPU time.
Feb 23 04:35:27 localhost systemd-logind[759]: Removed session 58.
Feb 23 04:35:27 localhost nova_compute[231721]: 2026-02-23 09:35:27.294 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:35:27 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:35:27.484 265541 INFO neutron.common.config [-] Logging enabled!#033[00m
Feb 23 04:35:27 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:35:27.484 265541 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44#033[00m
Feb 23 04:35:27 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:35:27.847 265541 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m
Feb 23 04:35:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11441 DF PROTO=TCP SPT=54726 DPT=9102 SEQ=3837831284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9EE060000000001030307)
Feb 23 04:35:28 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:35:28.443 265541 INFO neutron.agent.dhcp.agent [None req-99d25339-5e44-4afc-804a-4e9abec353a2 - - - - - -] All active networks have been fetched through RPC.#033[00m
Feb 23 04:35:28 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:35:28.443 265541 INFO neutron.agent.dhcp.agent [None req-99d25339-5e44-4afc-804a-4e9abec353a2 - - - - - -] Synchronizing state complete#033[00m
Feb 23 04:35:28 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:35:28.471 265541 INFO neutron.agent.dhcp.agent [None req-99d25339-5e44-4afc-804a-4e9abec353a2 - - - - - -] DHCP agent started#033[00m
Feb 23 04:35:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:35:29 localhost podman[265570]: 2026-02-23 09:35:29.917968818 +0000 UTC m=+0.088281862 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 04:35:29 localhost podman[265570]: 2026-02-23 09:35:29.957319487 +0000 UTC m=+0.127632511 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, version=9.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 04:35:29 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:35:30 localhost nova_compute[231721]: 2026-02-23 09:35:30.488 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:35:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24709 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BF9F9860000000001030307)
Feb 23 04:35:32 localhost nova_compute[231721]: 2026-02-23 09:35:32.325 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:35:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:35:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 04:35:34 localhost systemd[1]: tmp-crun.1D7M7H.mount: Deactivated successfully.
Feb 23 04:35:34 localhost podman[265592]: 2026-02-23 09:35:34.902285957 +0000 UTC m=+0.073554582 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 04:35:34 localhost podman[265592]: 2026-02-23 09:35:34.915385278 +0000 UTC m=+0.086653913 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 04:35:34 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 04:35:34 localhost podman[265591]: 2026-02-23 09:35:34.967916312 +0000 UTC m=+0.140088344 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller) Feb 23 04:35:35 localhost podman[265591]: 2026-02-23 09:35:35.004218207 +0000 UTC m=+0.176390239 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, config_id=ovn_controller, managed_by=edpm_ansible) Feb 23 04:35:35 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:35:35 localhost nova_compute[231721]: 2026-02-23 09:35:35.492 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:37 localhost nova_compute[231721]: 2026-02-23 09:35:37.372 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:35:37 localhost podman[265639]: 2026-02-23 09:35:37.899766401 +0000 UTC m=+0.078017277 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible) Feb 23 04:35:37 localhost podman[265639]: 2026-02-23 09:35:37.911837632 +0000 UTC m=+0.090088508 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:35:37 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:35:39 localhost podman[242954]: time="2026-02-23T09:35:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:35:39 localhost podman[242954]: @ - - [23/Feb/2026:09:35:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1" Feb 23 04:35:39 localhost podman[242954]: @ - - [23/Feb/2026:09:35:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1" Feb 23 04:35:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24710 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA1A060000000001030307) Feb 23 04:35:40 localhost nova_compute[231721]: 2026-02-23 09:35:40.495 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:42 localhost nova_compute[231721]: 2026-02-23 09:35:42.419 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:43 localhost openstack_network_exporter[245358]: ERROR 09:35:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:35:43 localhost openstack_network_exporter[245358]: Feb 23 04:35:43 localhost openstack_network_exporter[245358]: ERROR 09:35:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:35:43 localhost openstack_network_exporter[245358]: Feb 23 
04:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:35:43 localhost podman[265657]: 2026-02-23 09:35:43.904020519 +0000 UTC m=+0.081642468 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:35:43 localhost podman[265657]: 2026-02-23 09:35:43.937237355 +0000 UTC m=+0.114859264 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true) Feb 23 04:35:43 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:35:45 localhost nova_compute[231721]: 2026-02-23 09:35:45.511 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:47 localhost nova_compute[231721]: 2026-02-23 09:35:47.453 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:35:48.535 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:35:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:35:48.536 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:35:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:35:48.537 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:35:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:35:49 localhost podman[265675]: 2026-02-23 09:35:49.902516802 +0000 UTC m=+0.078417299 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:35:49 localhost podman[265675]: 2026-02-23 09:35:49.91421779 +0000 UTC m=+0.090118287 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:35:49 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:35:50 localhost nova_compute[231721]: 2026-02-23 09:35:50.536 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:52 localhost nova_compute[231721]: 2026-02-23 09:35:52.507 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55599 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA52F00000000001030307) Feb 23 04:35:54 localhost ovn_controller[157695]: 2026-02-23T09:35:54Z|00051|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory Feb 23 04:35:54 localhost sshd[265698]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:35:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55600 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA57060000000001030307) Feb 23 04:35:55 localhost nova_compute[231721]: 2026-02-23 09:35:55.575 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24711 DF PROTO=TCP SPT=42036 DPT=9102 SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA5A060000000001030307) Feb 23 04:35:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55601 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA5F060000000001030307) Feb 23 04:35:57 localhost sshd[265786]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:35:57 localhost nova_compute[231721]: 2026-02-23 09:35:57.542 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:35:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1613 DF PROTO=TCP SPT=45600 DPT=9102 SEQ=1071701713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA62060000000001030307) Feb 23 04:35:57 localhost nova_compute[231721]: 2026-02-23 09:35:57.987 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:57 localhost nova_compute[231721]: 2026-02-23 09:35:57.987 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:57 localhost nova_compute[231721]: 2026-02-23 09:35:57.988 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] 
CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:35:59 localhost nova_compute[231721]: 2026-02-23 09:35:59.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:35:59 localhost nova_compute[231721]: 2026-02-23 09:35:59.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:35:59 localhost nova_compute[231721]: 2026-02-23 09:35:59.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:36:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:36:00 localhost podman[265788]: 2026-02-23 09:36:00.196153001 +0000 UTC m=+0.088141496 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7) Feb 23 04:36:00 localhost podman[265788]: 2026-02-23 09:36:00.208678634 +0000 UTC m=+0.100667119 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': 
{'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, maintainer=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, release=1770267347, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-type=git) Feb 23 04:36:00 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:36:00 localhost nova_compute[231721]: 2026-02-23 09:36:00.365 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:36:00 localhost nova_compute[231721]: 2026-02-23 09:36:00.365 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:36:00 localhost nova_compute[231721]: 2026-02-23 09:36:00.365 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:36:00 localhost nova_compute[231721]: 2026-02-23 09:36:00.366 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:36:00 localhost nova_compute[231721]: 2026-02-23 09:36:00.579 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55602 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA6EC60000000001030307) Feb 23 04:36:01 localhost nova_compute[231721]: 2026-02-23 09:36:01.594 231725 DEBUG nova.network.neutron [None 
req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:36:01 localhost nova_compute[231721]: 2026-02-23 09:36:01.615 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:36:01 localhost nova_compute[231721]: 2026-02-23 09:36:01.616 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:36:01 localhost nova_compute[231721]: 2026-02-23 09:36:01.616 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:01 localhost nova_compute[231721]: 2026-02-23 09:36:01.617 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:01 localhost nova_compute[231721]: 2026-02-23 09:36:01.617 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:02 localhost nova_compute[231721]: 2026-02-23 09:36:02.578 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:02 localhost nova_compute[231721]: 2026-02-23 09:36:02.612 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:02 localhost nova_compute[231721]: 2026-02-23 09:36:02.612 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:03 localhost nova_compute[231721]: 2026-02-23 09:36:03.540 
231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.562 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.563 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.563 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.563 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) 
update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.564 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:36:05 localhost nova_compute[231721]: 2026-02-23 09:36:05.625 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:36:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:36:05 localhost podman[265828]: 2026-02-23 09:36:05.909665841 +0000 UTC m=+0.074131547 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:36:05 localhost podman[265829]: 2026-02-23 09:36:05.967583813 +0000 UTC m=+0.128346616 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:36:05 localhost podman[265828]: 2026-02-23 09:36:05.975378602 +0000 UTC m=+0.139844388 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:36:05 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.029 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:36:06 localhost podman[265829]: 2026-02-23 09:36:06.030310021 +0000 UTC m=+0.191072824 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:36:06 localhost systemd[1]: 
bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.095 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.096 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.317 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.319 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12160MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.319 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.320 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.402 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.403 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.403 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.439 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.892 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.899 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 
2026-02-23 09:36:06.917 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.920 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:36:06 localhost nova_compute[231721]: 2026-02-23 09:36:06.921 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:36:07 localhost nova_compute[231721]: 2026-02-23 09:36:07.623 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:36:08 localhost podman[265900]: 2026-02-23 09:36:08.911638138 +0000 UTC m=+0.083903617 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, 
org.label-schema.schema-version=1.0) Feb 23 04:36:08 localhost podman[265900]: 2026-02-23 09:36:08.921334584 +0000 UTC m=+0.093600073 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, io.buildah.version=1.43.0) Feb 23 04:36:08 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:36:09 localhost podman[242954]: time="2026-02-23T09:36:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:36:09 localhost podman[242954]: @ - - [23/Feb/2026:09:36:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1" Feb 23 04:36:09 localhost podman[242954]: @ - - [23/Feb/2026:09:36:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1" Feb 23 04:36:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55603 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFA90060000000001030307) Feb 23 04:36:10 localhost nova_compute[231721]: 2026-02-23 09:36:10.657 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:12 localhost nova_compute[231721]: 2026-02-23 09:36:12.665 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:13 localhost openstack_network_exporter[245358]: ERROR 09:36:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:36:13 localhost openstack_network_exporter[245358]: Feb 23 04:36:13 localhost openstack_network_exporter[245358]: ERROR 09:36:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:36:13 localhost openstack_network_exporter[245358]: Feb 23 04:36:14 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:36:14 localhost podman[265919]: 2026-02-23 09:36:14.906788232 +0000 UTC m=+0.082364570 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent) Feb 23 04:36:14 localhost podman[265919]: 2026-02-23 09:36:14.916240021 +0000 UTC m=+0.091816349 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2) Feb 23 04:36:14 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:36:15 localhost nova_compute[231721]: 2026-02-23 09:36:15.704 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:17 localhost nova_compute[231721]: 2026-02-23 09:36:17.705 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:17 localhost sshd[265937]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:36:19 localhost sshd[265939]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:36:19 localhost systemd-logind[759]: New session 59 of user zuul. Feb 23 04:36:19 localhost systemd[1]: Started Session 59 of User zuul. Feb 23 04:36:20 localhost python3.9[266050]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:36:20 localhost nova_compute[231721]: 2026-02-23 09:36:20.745 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:36:20 localhost systemd[1]: tmp-crun.G2FYdZ.mount: Deactivated successfully. 
Feb 23 04:36:20 localhost podman[266055]: 2026-02-23 09:36:20.929018984 +0000 UTC m=+0.091867290 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:36:20 localhost podman[266055]: 2026-02-23 09:36:20.938230436 +0000 UTC m=+0.101078752 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:36:20 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:36:22 localhost python3.9[266185]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:36:22 localhost network[266202]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:36:22 localhost network[266203]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:36:22 localhost network[266204]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 04:36:22 localhost nova_compute[231721]: 2026-02-23 09:36:22.755 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:36:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54901 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAC8200000000001030307) Feb 23 04:36:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54902 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFACC460000000001030307) Feb 23 04:36:25 localhost nova_compute[231721]: 2026-02-23 09:36:25.771 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55604 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAD0060000000001030307) Feb 23 04:36:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54903 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAD4460000000001030307) Feb 23 04:36:27 localhost nova_compute[231721]: 2026-02-23 09:36:27.803 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24712 DF PROTO=TCP SPT=42036 DPT=9102 
SEQ=2437173652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAD8060000000001030307) Feb 23 04:36:28 localhost python3.9[266436]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Feb 23 04:36:29 localhost python3.9[266499]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:36:30 localhost nova_compute[231721]: 2026-02-23 09:36:30.810 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:36:30 localhost podman[266502]: 2026-02-23 09:36:30.894038185 +0000 UTC m=+0.066485134 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, 
maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public) Feb 23 04:36:30 localhost podman[266502]: 2026-02-23 09:36:30.906199847 +0000 UTC m=+0.078646826 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, version=9.7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:36:30 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:36:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54904 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFAE4060000000001030307) Feb 23 04:36:32 localhost nova_compute[231721]: 2026-02-23 09:36:32.837 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:33 localhost sshd[266632]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:36:33 localhost python3.9[266631]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:36:33 localhost sshd[266651]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:36:34 localhost python3.9[266744]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:36:35 localhost python3.9[266856]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:36:35 localhost nova_compute[231721]: 2026-02-23 09:36:35.844 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:36:36 localhost systemd[1]: tmp-crun.9azg4N.mount: Deactivated successfully. 
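The kernel `DROPPING:` entries above are netfilter LOG-target output: a local firewall rule on `br-ex` is logging dropped TCP SYNs from 192.168.122.10 to port 9102 on 192.168.122.106 (the repeated identical SEQ values are retransmissions of the same blocked connection attempt). The `DROPPING:` prefix is this host's own `--log-prefix`; the rest is the standard netfilter KEY=VALUE layout. A minimal sketch of pulling those fields out for triage (the sample line is abridged from the entries above):

```python
import re

# Abridged kernel netfilter LOG entry, as seen in this journal.
line = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c "
        "MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 "
        "DST=192.168.122.106 LEN=60 TTL=62 ID=54901 DF PROTO=TCP "
        "SPT=45788 DPT=9102 SYN")

def parse_nflog(entry: str) -> dict:
    """Extract KEY=VALUE pairs from a netfilter LOG kernel message."""
    fields = dict(re.findall(r"(\w+)=(\S+)", entry))
    # Bare tokens without '=' (DF, SYN, ACK, ...) are packet flags;
    # skip the rule's local log prefix.
    fields["flags"] = [t for t in entry.split()
                       if "=" not in t and t != "DROPPING:"]
    return fields

f = parse_nflog(line)
print(f["SRC"], f["DST"], f["DPT"])  # 192.168.122.10 192.168.122.106 9102
```

Grouping such records by `(SRC, DPT)` quickly shows whether a drop is a one-off or, as here, a scrape target (9102) that a firewall rule is persistently blocking.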
Feb 23 04:36:36 localhost podman[266896]: 2026-02-23 09:36:36.125224225 +0000 UTC m=+0.083772343 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:36:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:36:36 localhost podman[266896]: 2026-02-23 09:36:36.187021295 +0000 UTC m=+0.145569383 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:36:36 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:36:36 localhost podman[266932]: 2026-02-23 09:36:36.273279173 +0000 UTC m=+0.127890202 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:36:36 localhost podman[266932]: 2026-02-23 09:36:36.306939102 +0000 UTC m=+0.161550111 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:36:36 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
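The recurring `Started /usr/bin/podman healthcheck run <id>` / `<id>.service: Deactivated successfully.` pairs in this journal are transient systemd units that run one container healthcheck and exit. Since the unit is named after the 64-hex container ID (as these entries show), start and completion can be correlated by that ID. A small sketch, assuming that naming convention holds:

```python
import re

# Abridged journal excerpt: one healthcheck unit starts, runs, deactivates.
journal = [
    "Feb 23 04:36:36 localhost systemd[1]: Started /usr/bin/podman "
    "healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.",
    "Feb 23 04:36:36 localhost systemd[1]: "
    "83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: "
    "Deactivated successfully.",
]

START = re.compile(r"Started /usr/bin/podman healthcheck run ([0-9a-f]{64})")
DONE = re.compile(r"([0-9a-f]{64})\.service: Deactivated successfully")

open_checks, completed = set(), []
for entry in journal:
    if m := START.search(entry):
        open_checks.add(m.group(1))
    elif (m := DONE.search(entry)) and m.group(1) in open_checks:
        open_checks.remove(m.group(1))
        completed.append(m.group(1))

print(len(completed), len(open_checks))  # 1 0
```

An ID left in `open_checks` at the end of a scan would flag a healthcheck that started but never deactivated — a hung check rather than the normal run/exit cycle seen here.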
Feb 23 04:36:36 localhost python3.9[267014]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:36:37 localhost nova_compute[231721]: 2026-02-23 09:36:37.896 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:37 localhost python3.9[267124]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:36:38 localhost python3.9[267236]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:36:39 localhost podman[242954]: time="2026-02-23T09:36:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:36:39 localhost podman[242954]: @ - - [23/Feb/2026:09:36:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1" Feb 23 04:36:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54905 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB04070000000001030307) Feb 23 04:36:39 localhost podman[242954]: @ - - [23/Feb/2026:09:36:39 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16788 "" "Go-http-client/1.1" Feb 23 04:36:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:36:39 localhost podman[267347]: 2026-02-23 09:36:39.908818894 +0000 UTC m=+0.080216484 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute) Feb 23 04:36:39 localhost podman[267347]: 2026-02-23 09:36:39.916172099 +0000 UTC m=+0.087569739 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:36:39 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:36:40 localhost python3.9[267346]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:36:40 localhost network[267383]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:36:40 localhost network[267384]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:36:40 localhost network[267385]: It is advised to switch to 'NetworkManager' instead for network management. Feb 23 04:36:40 localhost nova_compute[231721]: 2026-02-23 09:36:40.885 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
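The repeated systemd warning above (`insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead`) flags the deprecated cgroup-v1 directive. The usual fix is a drop-in override rather than editing the packaged unit file; a sketch of such an override, where the `512M` value is purely illustrative (the unit's actual limit is not shown in this log):

```ini
# /etc/systemd/system/insights-client.service.d/override.conf
# followed by: systemctl daemon-reload
[Service]
# Clear the deprecated cgroup-v1 setting, then set its cgroup-v2 successor.
MemoryLimit=
MemoryMax=512M
```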
Feb 23 04:36:42 localhost nova_compute[231721]: 2026-02-23 09:36:42.899 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:43 localhost openstack_network_exporter[245358]: ERROR 09:36:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:36:43 localhost openstack_network_exporter[245358]: Feb 23 04:36:43 localhost openstack_network_exporter[245358]: ERROR 09:36:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:36:43 localhost openstack_network_exporter[245358]: Feb 23 04:36:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:36:45 localhost podman[267617]: 2026-02-23 09:36:45.802415741 +0000 UTC m=+0.081099612 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:36:45 localhost podman[267617]: 2026-02-23 09:36:45.807834365 +0000 UTC m=+0.086518236 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:36:45 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:36:45 localhost nova_compute[231721]: 2026-02-23 09:36:45.921 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:45 localhost python3.9[267618]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:36:47 localhost nova_compute[231721]: 2026-02-23 09:36:47.938 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:36:48.537 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:36:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:36:48.537 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:36:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:36:48.539 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:36:50 localhost python3.9[267748]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 23 04:36:50 localhost python3.9[267858]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Feb 23 04:36:50 localhost nova_compute[231721]: 2026-02-23 09:36:50.954 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:36:51 localhost podman[267876]: 2026-02-23 09:36:51.908509876 +0000 UTC m=+0.083287389 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:36:51 localhost podman[267876]: 2026-02-23 09:36:51.945217748 +0000 UTC m=+0.119995251 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:36:51 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:36:52 localhost python3.9[267989]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:36:52 localhost nova_compute[231721]: 2026-02-23 09:36:52.980 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:52 localhost python3.9[268046]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:36:53 localhost python3.9[268156]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:36:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30275 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB3D500000000001030307) Feb 23 04:36:55 localhost python3.9[268266]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:36:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30276 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB41460000000001030307) Feb 23 04:36:55 localhost python3.9[268377]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:36:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54906 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB44090000000001030307) Feb 23 04:36:56 localhost nova_compute[231721]: 2026-02-23 09:36:55.999 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.130 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 
'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.131 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.135 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1fe22e74-0508-42eb-ad8e-0a07554970cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.131564', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '317ba12c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': 'fb630a78feef953a4ba356e4d7faf26d2010e253846eaadba2358a5b7506847a'}]}, 'timestamp': '2026-02-23 09:36:56.135927', '_unique_id': 'cc23c1c32f82485bb69337d6a2196cb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.136 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.137 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.156 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 56720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0d8ff29-4b95-4d8b-87cf-66da369dde2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56720000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:36:56.137917', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '317ec47e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.345447972, 
'message_signature': '5ac2c43f68c02900123c6c629c3d47e258cb1182feed293c338325cb9a94dec1'}]}, 'timestamp': '2026-02-23 09:36:56.156501', '_unique_id': 'bb27c4c44556485e92f3004f08973ef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.157 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.158 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.158 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '226a184a-41d4-48fb-bd46-3cc026909eee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:36:56.158179', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '317f14ec-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.345447972, 'message_signature': '173dcc04f6638a6fccb6cd4c2fad42b79757ae87e31d04b20681ef12f9902439'}]}, 'timestamp': '2026-02-23 09:36:56.158476', '_unique_id': '97a6b95d8dde455482fd8dc4b7632f98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.159 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6efae7f-da51-4507-b947-20c1e3cea2a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9662, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.159857', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '317f5808-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 
'message_signature': 'a24695fb0c128c648c73ede0e5f93cc9a482eecc3041589485f1a9667915cb2e'}]}, 'timestamp': '2026-02-23 09:36:56.160206', '_unique_id': '43cc0b16e18d4c54b383ce44a63a2b22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.160 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ac28fa4a-32c7-4470-9fce-197f36a89acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.161572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '31815e28-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '0280e6b6ac3c35500924e80018f526705465bb67a664841269297b7c02fe7f35'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.161572', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '31816b02-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '0a6443ccc114e2e682604b91a51c8941113946a941ff1d412e832a6b4e9cf4fc'}]}, 'timestamp': '2026-02-23 09:36:56.173780', '_unique_id': '7c068d6c3496411ba6adcc155fa47a9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.174 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.175 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.175 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21c74457-8456-4de2-a053-8b26bb1b166d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.175280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3181b094-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '43975f29dafb21da3f6d86ec1bb9f329dd47467025c160e290851478384c3215'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.175280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3181baf8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '2ffecd826bd9a56fde96f6bd09282b7ef260e6e6d78374221ae94c6b595af5c1'}]}, 'timestamp': '2026-02-23 09:36:56.175820', '_unique_id': '26c08b6162f94b83b89d0795d19bad6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.176 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.177 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2922c7ca-f3b1-4b92-902a-e8b2bef52a63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.177411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '31886ba0-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'b0fe6b8e261325fe0dd84e42baadffda3914e1845ee0f3e19d9047112205141a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.177411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318884d2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '047027b5330487789cd3e21e63927ed84252428d9dc8a4039bb0142e7c792baf'}]}, 'timestamp': '2026-02-23 09:36:56.220444', '_unique_id': '95ba58919d08417fb41a5abcd7a44cf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.221 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.223 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df77c528-cfce-4d25-aae7-5f118ad85d49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.224035', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name':
'vda'}, 'message_id': '318929be-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '75c6c27252c919b88f04cd25541b0239b341589566f50bc2f8cc81edb428d38e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.224035', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '31894110-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '7e6a761191f0965ece0c0a9167eeaf8030c8af5a0bffb36758732f1a98f041c8'}]}, 'timestamp': '2026-02-23 09:36:56.225246', '_unique_id': '601809edf277496ca4f8c4105d881e70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.227 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.outgoing.packets.error in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7902fe61-2462-4e0c-bc50-8297523882dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.227924', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 
'3189c1bc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '9b606888046d2a0c9b3c25c5c13c26650dbcb213fd88e79aafde48a8f01c54e9'}]}, 'timestamp': '2026-02-23 09:36:56.228585', '_unique_id': '60278f4e1bbc48878851076353cccb59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.230 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4f54dd6f-5291-43d8-8ddf-053f326b2ea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.231438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318a4a38-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '36c7c79255d5c542c51026ead07a870f932a623c9086e91cf86a7a581d0413f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.231438', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318a7080-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'e05eb23970af9eb18e69e0bc8f07600b598952bb8a6f6bda9c4fd5b16cca6f56'}]}, 'timestamp': '2026-02-23 09:36:56.233064', '_unique_id': 'fc14d2b0fc144d3e94038d1486d30876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a7c3fc40-e31c-411f-b12e-6387c6a42168', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.235627', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318aeef2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '6cc4920a3fe26ff32b15b6f63d26912a6342f392641ed377f80b504d886c024c'}]}, 'timestamp': '2026-02-23 09:36:56.236298', '_unique_id': '47b569c1f10b4206aca46079ef81fe4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.237 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa262652-8e53-467e-bef3-0ce2f344ded9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.238812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318b6b98-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '3c74dc0837d2938b893d1a8e104f2371e4407c2ffce6532efeb565f3feee78f1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.238812', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318b824a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'cd783402eb565a80a5d0b47d1adfe563fd6672fab3946e055d0bbd3c846a9e30'}]}, 'timestamp': '2026-02-23 09:36:56.240068', '_unique_id': 'bad394cd48bd4a51932cb0a6a3163507'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c63b1f37-3016-443d-ba41-0649b0e00eea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.242344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318bee56-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': '9428253c37a852e4953ef05e8fc314533deb1ddc3d06f019b664cd6e2a7180bc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.242344', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318bf9dc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.351031403, 'message_signature': 'be27f77adda33f5d3ccaf9502ad0bea85a595e78993caa8b0e993a6cc69c28e8'}]}, 'timestamp': '2026-02-23 09:36:56.243009', '_unique_id': '4f829810c73d4490ad68e333a2d4b6f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:36:56.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:36:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '432e7c8f-6c13-48c9-91c8-e804cbc3a987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.244460', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318c4130-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '83ada7970a2993bf8649c5223ab2450db320ec59982c63b4c9fe678db2f94597'}]}, 'timestamp': '2026-02-23 09:36:56.244838', '_unique_id': '8d8e1f9c01ef4f5fad00be085cac3a2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9de104e-cc4e-43b1-8dfb-940af9885796', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.246398', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318c8ce4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': 'df4e76b5a0ab4d2ecea552d432a9586d17c6608c6e6100230fe37ef37cdb10b0'}]}, 'timestamp': '2026-02-23 09:36:56.246789', '_unique_id': '9671f14319f94c6588f94ff7fbabfe34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd32bf8b6-5923-4bf5-92ae-072e47e52b3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.248420', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318cdbf4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '0c90fb3dd1279ddb3cd12e92550cc34ece2200f4ed195722bcfcb097895f225c'}]}, 'timestamp': '2026-02-23 09:36:56.248840', '_unique_id': 'fa95ed79b77448368e76daf190e8049e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f86618d-3f1a-41cb-b28d-a43dd677034d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.250663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318d34aa-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': 'ffcd6335e1717779e745919d695d92f070a69fcea0f840818412cb22b74af5b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.250663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318d4382-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '1e0370029bdad6b1fe409f1f7c7541232c088fe4c6b4344150a929911d01aa47'}]}, 'timestamp': '2026-02-23 09:36:56.251462', '_unique_id': 'c0488c43a37f484788e9518f1d3fac46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.252 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 92 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0994a540-78b4-4256-96e2-cb73d3f75f9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 92, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.253333', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318d9ce2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '40cb207035d8fa98d3f89ad61957ddff35c0d463c7c97e675efd6b9101d0e819'}]}, 'timestamp': '2026-02-23 09:36:56.253779', '_unique_id': '29ce9f392f7c4bc584ff07ba7e43d5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.254 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac8668db-7d11-425a-a30e-328e24837146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.255599', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318df44e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '75eff5a9e9d8945a7261784ccf0e373a4044123ca66ef40822fa00e35010759f'}]}, 'timestamp': '2026-02-23 09:36:56.256047', '_unique_id': '48cd00eb5a9143e9a0a0d0ffe12a6aba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cef4dceb-c1f4-4b82-855b-0c8f47b1bc66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:36:56.257829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '318e4c8c-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '85a65c02cdf63765e8ae780a0d8cd0862a94c1a970f7614fd14c024852459599'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:36:56.257829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '318e5b0a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.366874777, 'message_signature': '034a3935a7f6a3f0d62760107d3cc0aafab8f67d82244b0075e5f5cfabd582c9'}]}, 'timestamp': '2026-02-23 09:36:56.258623', '_unique_id': '4dc6e4a167a347708528b5cf671f819f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '41573a41-d2ee-4cad-8844-c41dbfe0b498', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:36:56.260526', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '318eb4ce-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10856.321025666, 'message_signature': '37c97b33f8c6a2f06be72f011c28294474f7ce7afa4e3cbc5b518c4893eb1c91'}]}, 'timestamp': '2026-02-23 09:36:56.260968', '_unique_id': 'f70031d580944536b5d9b90ec0b164db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:36:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:36:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:36:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 04:36:56 localhost python3.9[268488]: 
ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:36:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30277 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB49470000000001030307) Feb 23 04:36:58 localhost nova_compute[231721]: 2026-02-23 09:36:58.022 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:36:58 localhost python3.9[268656]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:36:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55605 DF PROTO=TCP SPT=54658 DPT=9102 SEQ=846702305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB4E060000000001030307) Feb 23 04:36:59 localhost python3.9[268798]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:36:59 localhost nova_compute[231721]: 2026-02-23 09:36:59.921 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:59 localhost nova_compute[231721]: 2026-02-23 09:36:59.922 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:36:59 localhost nova_compute[231721]: 2026-02-23 09:36:59.922 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:37:00 localhost python3.9[268908]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:00 localhost nova_compute[231721]: 2026-02-23 09:37:00.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:00 localhost nova_compute[231721]: 2026-02-23 09:37:00.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:37:00 localhost nova_compute[231721]: 2026-02-23 09:37:00.540 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:37:01 localhost python3.9[269018]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:01 localhost nova_compute[231721]: 2026-02-23 09:37:01.042 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30278 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB59060000000001030307) Feb 23 04:37:01 localhost nova_compute[231721]: 2026-02-23 09:37:01.423 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:37:01 localhost nova_compute[231721]: 2026-02-23 09:37:01.424 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:37:01 localhost nova_compute[231721]: 2026-02-23 09:37:01.424 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance 
_get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:37:01 localhost nova_compute[231721]: 2026-02-23 09:37:01.424 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:37:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:37:01 localhost podman[269129]: 2026-02-23 09:37:01.596433269 +0000 UTC m=+0.087059513 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 23 04:37:01 localhost podman[269129]: 2026-02-23 09:37:01.634662198 +0000 UTC m=+0.125288422 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, release=1770267347, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible) Feb 23 04:37:01 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:37:01 localhost python3.9[269128]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:02 localhost python3.9[269259]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:02 localhost nova_compute[231721]: 2026-02-23 09:37:02.614 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, 
"devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:37:02 localhost nova_compute[231721]: 2026-02-23 09:37:02.635 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:37:02 localhost nova_compute[231721]: 2026-02-23 09:37:02.636 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:37:02 localhost nova_compute[231721]: 2026-02-23 09:37:02.636 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:02 localhost nova_compute[231721]: 2026-02-23 09:37:02.637 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:03 localhost nova_compute[231721]: 2026-02-23 09:37:03.063 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:03 localhost python3.9[269369]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf 
follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:37:03 localhost nova_compute[231721]: 2026-02-23 09:37:03.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:03 localhost nova_compute[231721]: 2026-02-23 09:37:03.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:04 localhost python3.9[269481]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:04 localhost nova_compute[231721]: 2026-02-23 09:37:04.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:05 localhost nova_compute[231721]: 2026-02-23 09:37:05.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:05 localhost nova_compute[231721]: 2026-02-23 09:37:05.587 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:37:05 localhost nova_compute[231721]: 2026-02-23 09:37:05.587 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:37:05 localhost nova_compute[231721]: 2026-02-23 09:37:05.587 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:37:05 localhost nova_compute[231721]: 2026-02-23 09:37:05.588 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:37:05 localhost nova_compute[231721]: 2026-02-23 09:37:05.588 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.006 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.044 
231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.075 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.076 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:37:06 localhost python3.9[269613]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.279 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.281 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12159MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.282 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.282 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:37:06 localhost systemd[1]: tmp-crun.jLeBKs.mount: Deactivated successfully. 
Feb 23 04:37:06 localhost podman[269617]: 2026-02-23 09:37:06.341249355 +0000 UTC m=+0.083918557 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.345 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.346 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.346 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:37:06 localhost podman[269617]: 2026-02-23 09:37:06.38030767 +0000 UTC m=+0.122976912 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, container_name=ovn_controller) Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.380 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:37:06 localhost systemd[1]: 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:37:06 localhost podman[269654]: 2026-02-23 09:37:06.438991624 +0000 UTC m=+0.071812597 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:37:06 localhost podman[269654]: 2026-02-23 09:37:06.451085344 +0000 UTC m=+0.083906327 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:37:06 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.812 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.819 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.836 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.839 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:37:06 localhost nova_compute[231721]: 2026-02-23 09:37:06.839 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:37:07 localhost python3.9[269793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Feb 23 04:37:07 localhost sshd[269904]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:37:07 localhost python3.9[269903]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Feb 23 04:37:08 localhost nova_compute[231721]: 2026-02-23 09:37:08.099 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:08 localhost python3.9[270015]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:37:09 localhost python3.9[270072]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:09 localhost podman[242954]: time="2026-02-23T09:37:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" 
Feb 23 04:37:09 localhost podman[242954]: @ - - [23/Feb/2026:09:37:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1" Feb 23 04:37:09 localhost podman[242954]: @ - - [23/Feb/2026:09:37:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1" Feb 23 04:37:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30279 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFB7A060000000001030307) Feb 23 04:37:09 localhost python3.9[270182]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:37:10 localhost podman[270292]: 2026-02-23 09:37:10.589240898 +0000 UTC m=+0.081803143 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:37:10 localhost podman[270292]: 2026-02-23 09:37:10.627969172 +0000 UTC m=+0.120531357 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:37:10 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:37:10 localhost python3.9[270293]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Feb 23 04:37:11 localhost nova_compute[231721]: 2026-02-23 09:37:11.080 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:13 localhost nova_compute[231721]: 2026-02-23 09:37:13.132 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:13 localhost openstack_network_exporter[245358]: ERROR 09:37:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:37:13 localhost openstack_network_exporter[245358]: Feb 23 04:37:13 localhost openstack_network_exporter[245358]: ERROR 09:37:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:37:13 localhost openstack_network_exporter[245358]: Feb 23 04:37:14 localhost sshd[270386]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:37:14 localhost python3.9[270424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Feb 23 04:37:15 localhost python3.9[270538]: ansible-ansible.builtin.file 
Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:16 localhost nova_compute[231721]: 2026-02-23 09:37:16.117 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:16 localhost sshd[270630]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:37:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:37:16 localhost podman[270651]: 2026-02-23 09:37:16.736571584 +0000 UTC m=+0.087172157 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:37:16 localhost podman[270651]: 2026-02-23 09:37:16.768083187 +0000 UTC m=+0.118683740 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0) Feb 23 04:37:16 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:37:16 localhost python3.9[270650]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:37:16 localhost systemd[1]: Reloading. Feb 23 04:37:17 localhost systemd-rc-local-generator[270693]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:37:17 localhost systemd-sysv-generator[270698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:17 localhost python3.9[270812]: ansible-ansible.builtin.service_facts Invoked Feb 23 04:37:17 localhost network[270829]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Feb 23 04:37:17 localhost network[270830]: 'network-scripts' will be removed from distribution in near future. Feb 23 04:37:17 localhost network[270831]: It is advised to switch to 'NetworkManager' instead for network management. 
Feb 23 04:37:18 localhost nova_compute[231721]: 2026-02-23 09:37:18.177 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:19 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:37:21 localhost nova_compute[231721]: 2026-02-23 09:37:21.155 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:37:22 localhost podman[270971]: 2026-02-23 09:37:22.911067172 +0000 UTC m=+0.082826864 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:37:22 localhost podman[270971]: 2026-02-23 09:37:22.923831073 +0000 UTC m=+0.095590825 container exec_died 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:37:22 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:37:23 localhost nova_compute[231721]: 2026-02-23 09:37:23.206 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29761 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBB2800000000001030307) Feb 23 04:37:24 localhost python3.9[271084]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:24 localhost python3.9[271195]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29762 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBB6860000000001030307) Feb 23 04:37:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30280 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBBA060000000001030307) Feb 23 04:37:26 localhost nova_compute[231721]: 2026-02-23 09:37:26.185 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:26 localhost python3.9[271306]: 
ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:27 localhost python3.9[271417]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29763 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBBE870000000001030307) Feb 23 04:37:27 localhost python3.9[271528]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54907 DF PROTO=TCP SPT=45788 DPT=9102 SEQ=998281806 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBC2060000000001030307) Feb 23 04:37:28 localhost nova_compute[231721]: 2026-02-23 09:37:28.243 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:28 localhost python3.9[271639]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:29 localhost python3.9[271750]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service 
state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:29 localhost python3.9[271861]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:37:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29764 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBCE460000000001030307) Feb 23 04:37:31 localhost nova_compute[231721]: 2026-02-23 09:37:31.229 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:37:31 localhost podman[271973]: 2026-02-23 09:37:31.856896522 +0000 UTC m=+0.086381352 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter) Feb 23 04:37:31 localhost podman[271973]: 2026-02-23 09:37:31.872279703 +0000 UTC m=+0.101764553 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:37:31 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:37:31 localhost python3.9[271972]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:32 localhost python3.9[272102]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:33 localhost python3.9[272212]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:33 localhost nova_compute[231721]: 2026-02-23 09:37:33.281 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:33 localhost python3.9[272322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:35 localhost python3.9[272432]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:35 localhost python3.9[272542]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:36 localhost nova_compute[231721]: 2026-02-23 09:37:36.229 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:37:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:37:36 localhost podman[272653]: 2026-02-23 09:37:36.896904315 +0000 UTC m=+0.078294134 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:37:36 localhost podman[272653]: 2026-02-23 09:37:36.910313345 +0000 UTC m=+0.091703224 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:37:36 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:37:37 localhost podman[272652]: 2026-02-23 09:37:37.001807144 +0000 UTC m=+0.186995120 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 23 04:37:37 localhost python3.9[272654]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Feb 23 04:37:37 localhost podman[272652]: 2026-02-23 09:37:37.073346141 +0000 UTC m=+0.258534177 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0) Feb 23 04:37:37 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:37:37 localhost python3.9[272811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:38 localhost nova_compute[231721]: 2026-02-23 09:37:38.311 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:38 localhost python3.9[272921]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:39 localhost python3.9[273031]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29765 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFBEE060000000001030307) Feb 23 04:37:39 localhost 
podman[242954]: time="2026-02-23T09:37:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:37:39 localhost podman[242954]: @ - - [23/Feb/2026:09:37:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1" Feb 23 04:37:39 localhost podman[242954]: @ - - [23/Feb/2026:09:37:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16793 "" "Go-http-client/1.1" Feb 23 04:37:39 localhost python3.9[273141]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:40 localhost python3.9[273251]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:40 localhost python3.9[273361]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:40 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:37:40 localhost podman[273379]: 2026-02-23 09:37:40.909602672 +0000 UTC m=+0.085369253 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 04:37:40 localhost podman[273379]: 2026-02-23 09:37:40.925327872 +0000 UTC m=+0.101094433 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:37:40 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:37:41 localhost python3.9[273488]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:41 localhost nova_compute[231721]: 2026-02-23 09:37:41.264 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:41 localhost python3.9[273598]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:42 localhost python3.9[273708]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:37:43 localhost nova_compute[231721]: 2026-02-23 
09:37:43.343 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:43 localhost openstack_network_exporter[245358]: ERROR 09:37:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:37:43 localhost openstack_network_exporter[245358]: Feb 23 04:37:43 localhost openstack_network_exporter[245358]: ERROR 09:37:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:37:43 localhost openstack_network_exporter[245358]: Feb 23 04:37:43 localhost python3.9[273818]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:43 localhost sshd[273876]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:37:44 localhost python3.9[273930]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Feb 23 04:37:46 localhost python3.9[274040]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Feb 23 04:37:46 localhost systemd[1]: Reloading. 
Feb 23 04:37:46 localhost nova_compute[231721]: 2026-02-23 09:37:46.294 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:46 localhost systemd-rc-local-generator[274065]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:37:46 localhost systemd-sysv-generator[274070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:37:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:37:46 localhost podman[274094]: 2026-02-23 09:37:46.910333106 +0000 UTC m=+0.087368513 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:37:46 localhost podman[274094]: 2026-02-23 09:37:46.948196963 +0000 UTC m=+0.125232330 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent) Feb 23 04:37:46 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:37:47 localhost python3.9[274203]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:48 localhost nova_compute[231721]: 2026-02-23 09:37:48.379 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:48 localhost python3.9[274314]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:37:48.538 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:37:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:37:48.539 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:37:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:37:48.540 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:37:48 localhost python3.9[274425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:49 localhost python3.9[274536]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:50 localhost python3.9[274647]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:50 localhost python3.9[274758]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True 
_raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:51 localhost nova_compute[231721]: 2026-02-23 09:37:51.327 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:51 localhost python3.9[274869]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:52 localhost python3.9[274980]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:37:53 localhost nova_compute[231721]: 2026-02-23 09:37:53.412 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:53 localhost nova_compute[231721]: 2026-02-23 09:37:53.540 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:53 localhost nova_compute[231721]: 2026-02-23 09:37:53.541 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 23 04:37:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:37:53 localhost podman[274999]: 2026-02-23 09:37:53.910255149 +0000 UTC m=+0.084176477 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:37:53 localhost podman[274999]: 2026-02-23 09:37:53.923235703 +0000 UTC m=+0.097157021 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:37:53 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:37:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64118 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC27B00000000001030307) Feb 23 04:37:54 localhost python3.9[275112]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:37:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64119 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC2BC60000000001030307) Feb 23 04:37:55 localhost nova_compute[231721]: 2026-02-23 09:37:55.574 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:55 localhost nova_compute[231721]: 2026-02-23 09:37:55.574 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Cleaning up deleted instances _run_pending_deletes 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 23 04:37:55 localhost nova_compute[231721]: 2026-02-23 09:37:55.592 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 23 04:37:55 localhost python3.9[275222]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:37:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29766 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC2E060000000001030307) Feb 23 04:37:56 localhost sshd[275328]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:37:56 localhost nova_compute[231721]: 2026-02-23 09:37:56.357 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:56 localhost python3.9[275334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:37:57 localhost python3.9[275444]: ansible-ansible.builtin.file Invoked with 
group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:37:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64120 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC33C60000000001030307) Feb 23 04:37:58 localhost python3.9[275554]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:37:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30281 DF PROTO=TCP SPT=45566 DPT=9102 SEQ=244674139 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC38070000000001030307) Feb 23 04:37:58 localhost nova_compute[231721]: 2026-02-23 09:37:58.450 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:37:59 localhost python3.9[275664]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Feb 23 04:37:59 localhost nova_compute[231721]: 2026-02-23 09:37:59.559 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:59 localhost nova_compute[231721]: 2026-02-23 09:37:59.559 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:37:59 localhost nova_compute[231721]: 2026-02-23 09:37:59.560 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:38:00 localhost python3.9[275842]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:00 localhost nova_compute[231721]: 2026-02-23 09:38:00.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:00 localhost python3.9[275952]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64121 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC43870000000001030307) Feb 23 04:38:01 localhost python3.9[276062]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:01 localhost nova_compute[231721]: 2026-02-23 09:38:01.399 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.541 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.542 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.755 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.755 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.755 231725 DEBUG nova.network.neutron 
[None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:38:02 localhost nova_compute[231721]: 2026-02-23 09:38:02.756 231725 DEBUG nova.objects.instance [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:38:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:38:02 localhost podman[276080]: 2026-02-23 09:38:02.918313634 +0000 UTC m=+0.092481377 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347) Feb 23 04:38:02 localhost podman[276080]: 2026-02-23 09:38:02.955857555 +0000 UTC m=+0.130025328 
container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7) Feb 23 04:38:02 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:38:03 localhost nova_compute[231721]: 2026-02-23 09:38:03.490 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:04 localhost nova_compute[231721]: 2026-02-23 09:38:04.661 231725 DEBUG nova.network.neutron [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:38:04 localhost nova_compute[231721]: 2026-02-23 09:38:04.688 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 
04:38:04 localhost nova_compute[231721]: 2026-02-23 09:38:04.688 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:38:04 localhost nova_compute[231721]: 2026-02-23 09:38:04.689 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:04 localhost nova_compute[231721]: 2026-02-23 09:38:04.690 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:04 localhost nova_compute[231721]: 2026-02-23 09:38:04.690 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.555 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.556 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.582 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.582 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.604 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.604 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.605 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.605 231725 DEBUG nova.compute.resource_tracker [None 
req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:38:05 localhost nova_compute[231721]: 2026-02-23 09:38:05.605 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.058 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.130 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.131 231725 DEBUG nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.335 231725 WARNING nova.virt.libvirt.driver [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.336 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12171MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.337 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.337 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.400 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.548 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.548 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.549 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.620 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.693 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 
04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.693 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.717 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.744 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_AVX2,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_AVX,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NODE,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,COMPUTE_VOLUME_ATTACH_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 04:38:06 localhost nova_compute[231721]: 2026-02-23 09:38:06.993 231725 DEBUG oslo_concurrency.processutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:38:07 localhost nova_compute[231721]: 2026-02-23 09:38:07.440 231725 DEBUG oslo_concurrency.processutils [None 
req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:38:07 localhost nova_compute[231721]: 2026-02-23 09:38:07.447 231725 DEBUG nova.compute.provider_tree [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:38:07 localhost nova_compute[231721]: 2026-02-23 09:38:07.475 231725 DEBUG nova.scheduler.client.report [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:38:07 localhost nova_compute[231721]: 2026-02-23 09:38:07.478 231725 DEBUG nova.compute.resource_tracker [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:38:07 localhost nova_compute[231721]: 2026-02-23 09:38:07.478 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.141s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:38:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:38:07 localhost podman[276162]: 2026-02-23 09:38:07.907815412 +0000 UTC m=+0.079809570 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller) Feb 23 04:38:07 localhost podman[276162]: 2026-02-23 09:38:07.97827991 
+0000 UTC m=+0.150274078 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true) Feb 23 04:38:07 localhost podman[276163]: 2026-02-23 09:38:07.991825803 +0000 UTC m=+0.161025734 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:38:07 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:38:07 localhost podman[276163]: 2026-02-23 09:38:07.999298187 +0000 UTC m=+0.168498148 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:38:08 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:38:08 localhost nova_compute[231721]: 2026-02-23 09:38:08.492 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:09 localhost podman[242954]: time="2026-02-23T09:38:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:38:09 localhost podman[242954]: @ - - [23/Feb/2026:09:38:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149683 "" "Go-http-client/1.1" Feb 23 04:38:09 localhost podman[242954]: @ - - [23/Feb/2026:09:38:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1" Feb 23 04:38:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64122 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC64070000000001030307) Feb 23 04:38:09 localhost python3.9[276300]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Feb 23 04:38:11 localhost sshd[276319]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:38:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:38:11 localhost systemd-logind[759]: New session 60 of user zuul. Feb 23 04:38:11 localhost systemd[1]: Started Session 60 of User zuul. 
Feb 23 04:38:11 localhost podman[276321]: 2026-02-23 09:38:11.266961958 +0000 UTC m=+0.095633174 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team) Feb 23 04:38:11 localhost podman[276321]: 2026-02-23 09:38:11.307432341 +0000 UTC m=+0.136103527 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2) Feb 23 04:38:11 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:38:11 localhost systemd[1]: session-60.scope: Deactivated successfully. Feb 23 04:38:11 localhost systemd-logind[759]: Session 60 logged out. Waiting for processes to exit. Feb 23 04:38:11 localhost systemd-logind[759]: Removed session 60. Feb 23 04:38:11 localhost nova_compute[231721]: 2026-02-23 09:38:11.440 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:12 localhost python3.9[276450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:12 localhost python3.9[276505]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:12 localhost python3.9[276613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:13 localhost openstack_network_exporter[245358]: ERROR 09:38:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:38:13 localhost openstack_network_exporter[245358]: Feb 23 04:38:13 localhost openstack_network_exporter[245358]: ERROR 09:38:13 appctl.go:174: 
call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:38:13 localhost openstack_network_exporter[245358]: Feb 23 04:38:13 localhost nova_compute[231721]: 2026-02-23 09:38:13.494 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:13 localhost python3.9[276699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839492.5864537-2355-125696160685702/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:14 localhost python3.9[276808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:14 localhost sshd[276809]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:38:14 localhost python3.9[276896]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839493.672255-2355-194442543401136/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:15 localhost python3.9[277004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False 
get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:15 localhost python3.9[277090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839494.7546844-2355-278485605091449/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:16 localhost python3.9[277198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:16 localhost nova_compute[231721]: 2026-02-23 09:38:16.482 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:16 localhost python3.9[277284]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771839496.018215-2517-249906156195919/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=3012482a375a6db0cadffa2656b647c3720d54e9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:38:17 localhost systemd[1]: tmp-crun.gm4S5U.mount: Deactivated successfully. 
Feb 23 04:38:17 localhost podman[277395]: 2026-02-23 09:38:17.546964565 +0000 UTC m=+0.086525870 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:38:17 localhost 
podman[277395]: 2026-02-23 09:38:17.577050584 +0000 UTC m=+0.116611899 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, container_name=ovn_metadata_agent) Feb 23 04:38:17 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:38:17 localhost python3.9[277394]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:18 localhost python3.9[277523]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:18 localhost nova_compute[231721]: 2026-02-23 09:38:18.530 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:18 localhost sshd[277557]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:38:19 localhost nova_compute[231721]: 2026-02-23 09:38:19.565 231725 DEBUG oslo_service.periodic_task [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:38:19 localhost nova_compute[231721]: 2026-02-23 09:38:19.587 231725 DEBUG nova.compute.manager [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 23 04:38:19 localhost nova_compute[231721]: 2026-02-23 09:38:19.588 
231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:38:19 localhost nova_compute[231721]: 2026-02-23 09:38:19.588 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:38:19 localhost python3.9[277635]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:19 localhost nova_compute[231721]: 2026-02-23 09:38:19.629 231725 DEBUG oslo_concurrency.lockutils [None req-7ac36a78-52de-4e8c-8e36-dffbffed9664 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:38:20 localhost python3.9[277747]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:21 localhost nova_compute[231721]: 2026-02-23 09:38:21.513 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:21 localhost python3.9[277855]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:22 localhost python3.9[277967]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:23 localhost python3.9[278077]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:23 localhost nova_compute[231721]: 2026-02-23 09:38:23.569 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23718 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFC9CE00000000001030307) Feb 23 04:38:24 localhost python3.9[278185]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:38:24 localhost podman[278285]: 2026-02-23 09:38:24.908323699 +0000 UTC m=+0.079706419 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:38:24 localhost podman[278285]: 2026-02-23 09:38:24.918484735 +0000 UTC m=+0.089867425 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:38:24 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:38:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23719 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCA1060000000001030307) Feb 23 04:38:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64123 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCA4060000000001030307) Feb 23 04:38:26 localhost nova_compute[231721]: 2026-02-23 09:38:26.536 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:26 localhost python3.9[278514]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False Feb 23 04:38:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=23720 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCA9060000000001030307) Feb 23 04:38:27 localhost python3.9[278624]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:38:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29767 DF PROTO=TCP SPT=56496 DPT=9102 SEQ=2296097628 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCAC060000000001030307) Feb 23 04:38:28 localhost nova_compute[231721]: 2026-02-23 09:38:28.620 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:28 localhost python3[278734]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:38:29 localhost python3[278734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3",#012 "Digest": "sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-02-23T06:27:42.035349623Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 
],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1216089983,#012 "VirtualSize": 1216089983,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",#012 "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",#012 "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",#012 "sha256:5511acb0625eca242fd47549a8bafd7826358a029c48a9158ddd6fa2b7e0b86d",#012 "sha256:1f1e90f8b2058c74071fe0298f6d20f4d1edbde3bdd940d26fcd35c036f677a8"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": 
"1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-02-17T01:25:07.246646992Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:07.246739119Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260216\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:12.132997501Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-02-23T06:08:39.081651802Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081666472Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081677733Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081688343Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081701553Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": 
true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081710413Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.413481757Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:09:13.490649497Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 23 04:38:29 localhost podman[278784]: 2026-02-23 09:38:29.275360598 +0000 UTC m=+0.068392265 container remove 29fbf3e6d165ac37a2073e9d11df0954b5b34530c2d4564677cda92707f802aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 
'/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=nova_compute_init, managed_by=edpm_ansible) Feb 23 04:38:29 localhost python3[278734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute_init Feb 23 04:38:29 localhost podman[278798]: Feb 23 04:38:29 localhost podman[278798]: 2026-02-23 09:38:29.383198232 +0000 UTC m=+0.088859203 container create 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=nova_compute_init, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0) Feb 23 04:38:29 localhost podman[278798]: 2026-02-23 09:38:29.340094487 +0000 UTC m=+0.045755458 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 23 04:38:29 localhost python3[278734]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume 
/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Feb 23 04:38:30 localhost python3.9[278946]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23721 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCB8C70000000001030307) Feb 23 04:38:31 localhost python3.9[279056]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:38:31 localhost nova_compute[231721]: 2026-02-23 09:38:31.566 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:32 localhost python3.9[279166]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:38:33 localhost systemd[1]: tmp-crun.SInQ7A.mount: Deactivated successfully. 
Feb 23 04:38:33 localhost podman[279257]: 2026-02-23 09:38:33.153421592 +0000 UTC m=+0.100590770 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 23 04:38:33 localhost podman[279257]: 2026-02-23 09:38:33.193527993 +0000 UTC m=+0.140697101 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, 
config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.expose-services=) Feb 23 04:38:33 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:38:33 localhost python3.9[279256]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839512.207842-2988-172180113568438/.source.yaml _original_basename=.tfw04s0h follow=False checksum=f9aa9ce623bd0367523b1516d0fd40e0aad40b65 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:33 localhost nova_compute[231721]: 2026-02-23 09:38:33.666 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:34 localhost python3.9[279386]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:35 localhost python3.9[279496]: ansible-ansible.builtin.file Invoked 
with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Feb 23 04:38:36 localhost python3.9[279606]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:36 localhost nova_compute[231721]: 2026-02-23 09:38:36.614 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:36 localhost python3.9[279663]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/nova_compute.json _original_basename=.slfkserr recurse=False state=file path=/var/lib/kolla/config_files/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:37 localhost sshd[279697]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:38:37 localhost python3.9[279773]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:38 localhost nova_compute[231721]: 
2026-02-23 09:38:38.701 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:38:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:38:38 localhost podman[279898]: 2026-02-23 09:38:38.920063184 +0000 UTC m=+0.081534495 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack 
Kubernetes Operator team) Feb 23 04:38:38 localhost podman[279898]: 2026-02-23 09:38:38.965328936 +0000 UTC m=+0.126800217 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:38:38 localhost systemd[1]: tmp-crun.TiUc46.mount: Deactivated successfully. Feb 23 04:38:38 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:38:38 localhost podman[279901]: 2026-02-23 09:38:38.987990563 +0000 UTC m=+0.145423648 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:38:39 localhost podman[279901]: 2026-02-23 09:38:39.001251447 +0000 UTC m=+0.158684542 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:38:39 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:38:39 localhost podman[242954]: time="2026-02-23T09:38:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:38:39 localhost podman[242954]: @ - - [23/Feb/2026:09:38:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149689 "" "Go-http-client/1.1" Feb 23 04:38:39 localhost podman[242954]: @ - - [23/Feb/2026:09:38:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1" Feb 23 04:38:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23722 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFCDA070000000001030307) Feb 23 04:38:40 localhost python3.9[280126]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False Feb 23 04:38:41 localhost python3.9[280236]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Feb 23 04:38:41 localhost nova_compute[231721]: 2026-02-23 09:38:41.668 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:38:41 localhost systemd[1]: tmp-crun.Ejy2ld.mount: Deactivated successfully. 
Feb 23 04:38:41 localhost podman[280325]: 2026-02-23 09:38:41.921935893 +0000 UTC m=+0.098008489 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:38:41 localhost podman[280325]: 2026-02-23 09:38:41.958487543 +0000 UTC m=+0.134560179 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Feb 23 04:38:41 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:38:42 localhost python3[280360]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False Feb 23 04:38:42 localhost python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "72feed39d002da96e9458f5df3225bc8b72f1ae28f906a4ea01e253f86aab9e3",#012 "Digest": "sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:60339e5e0cd7bfe18718bee79174c18ef91b932586fd96f01b9799d5d120385d"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2026-02-23T06:27:42.035349623Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 
"Os": "linux",#012 "Size": 1216089983,#012 "VirtualSize": 1216089983,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/239567307c66a10c4dd721df6a9263fcc38501437d275d2b4907c616b635d111/diff:/var/lib/containers/storage/overlay/0455f1f13172510bfb03afa514ad1dc5f28a2039a4c0ae85e44e0cde63814ca4/diff:/var/lib/containers/storage/overlay/882df85a0cf43e46bc799aafd5ff81035654b304c2fef5dbd26c9dd0c2e9fcc3/diff:/var/lib/containers/storage/overlay/d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/7e5a6b3af0e35b266ef2f57ba1f524615772066427004655f3e99c4f9072865c/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:d9f14c75a7289cf010d2e5175c554193dba109f864fe39fc418f3bc5b90efe9d",#012 "sha256:6eb5d45c6942983139aec78264b4b68bafe46465bb40e2bb4c09e78dad8ba6c0",#012 "sha256:9a59f9675e4fdfdb0eaa24dcce26bed374feef6430ea888b6f5ef1274a95bd90",#012 "sha256:5511acb0625eca242fd47549a8bafd7826358a029c48a9158ddd6fa2b7e0b86d",#012 "sha256:1f1e90f8b2058c74071fe0298f6d20f4d1edbde3bdd940d26fcd35c036f677a8"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.43.0",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20260216",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "8419493e1fd846703d277695e03fc5eb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2026-02-17T01:25:07.246646992Z",#012 "created_by": "/bin/sh -c #(nop) ADD 
file:d064f128d9bf147a386d5c0e8c2e8a6f698c81fb4e2404e09afe5ef1e1d3b529 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:07.246739119Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20260216\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-17T01:25:12.132997501Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2026-02-23T06:08:39.081651802Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081666472Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081677733Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081688343Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081701553Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.081710413Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:08:39.413481757Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2026-02-23T06:09:13.490649497Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main 
clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Feb 23 04:38:42 localhost nova_compute[231721]: 2026-02-23 09:38:42.576 231725 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Feb 23 04:38:43 localhost openstack_network_exporter[245358]: ERROR 09:38:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:38:43 localhost openstack_network_exporter[245358]: Feb 23 04:38:43 localhost openstack_network_exporter[245358]: ERROR 09:38:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:38:43 localhost openstack_network_exporter[245358]: Feb 23 04:38:43 localhost nova_compute[231721]: 2026-02-23 09:38:43.702 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:46 localhost nova_compute[231721]: 2026-02-23 09:38:46.670 231725 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:46 localhost nova_compute[231721]: 2026-02-23 09:38:46.681 231725 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Feb 23 04:38:46 localhost nova_compute[231721]: 2026-02-23 09:38:46.683 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:38:46 localhost nova_compute[231721]: 2026-02-23 09:38:46.684 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:38:46 localhost nova_compute[231721]: 2026-02-23 09:38:46.684 231725 DEBUG oslo_concurrency.lockutils [None req-aa0441fe-3835-4537-90d6-e0b40aee90c2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:38:47 localhost journal[207530]: End of file while reading data: Input/output error Feb 23 04:38:47 localhost systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Deactivated successfully. Feb 23 04:38:47 localhost systemd[1]: libpod-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d.scope: Consumed 18.106s CPU time. 
Feb 23 04:38:47 localhost podman[280415]: 2026-02-23 09:38:47.06498717 +0000 UTC m=+4.555172228 container died 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:38:47 localhost systemd[1]: tmp-crun.iUVFug.mount: 
Deactivated successfully. Feb 23 04:38:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d-userdata-shm.mount: Deactivated successfully. Feb 23 04:38:47 localhost podman[280415]: 2026-02-23 09:38:47.203825291 +0000 UTC m=+4.694010349 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', 
'/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Feb 23 04:38:47 localhost python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman stop nova_compute Feb 23 04:38:47 localhost podman[280429]: 2026-02-23 09:38:47.21884758 +0000 UTC m=+0.141282549 container cleanup 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', 
'/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.43.0) Feb 23 04:38:47 localhost podman[280444]: 2026-02-23 09:38:47.317978482 +0000 UTC m=+0.091675150 container remove 8d27414cca68da82346fae1fc6c4ecb36a1a0e33cd4571c19621f2f476697e2d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-5f6dcf25a4eb712d7b55775b8e130167254d53f3f84c8303f8f39f30426e780b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:38:47 localhost podman[280450]: Error: no container with name or ID "nova_compute" found: no such container Feb 23 04:38:47 localhost systemd[1]: edpm_nova_compute.service: Control process exited, code=exited, status=125/n/a Feb 23 04:38:47 localhost python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Feb 23 04:38:47 localhost podman[280471]: Error: no container with name or ID "nova_compute" found: no such container Feb 23 04:38:47 localhost podman[280472]: Feb 23 04:38:47 localhost systemd[1]: edpm_nova_compute.service: Control process exited, code=exited, status=125/n/a Feb 23 04:38:47 localhost systemd[1]: edpm_nova_compute.service: Failed with result 'exit-code'. Feb 23 04:38:47 localhost podman[280472]: 2026-02-23 09:38:47.415546626 +0000 UTC m=+0.068094265 container create 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260216, container_name=nova_compute, config_id=nova_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:38:47 localhost podman[280472]: 2026-02-23 09:38:47.37818045 +0000 UTC m=+0.030728069 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Feb 23 04:38:47 localhost python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Feb 23 04:38:47 localhost systemd[1]: 
edpm_nova_compute.service: Scheduled restart job, restart counter is at 1. Feb 23 04:38:47 localhost systemd[1]: Started libpod-conmon-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope. Feb 23 04:38:47 localhost systemd[1]: Stopped nova_compute container. Feb 23 04:38:47 localhost systemd[1]: Starting nova_compute container... Feb 23 04:38:47 localhost systemd[1]: Started libcrun container. Feb 23 04:38:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 23 04:38:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 23 04:38:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 04:38:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 04:38:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 23 04:38:47 localhost podman[280497]: 2026-02-23 09:38:47.570983996 +0000 UTC m=+0.137793991 container init 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=nova_compute, container_name=nova_compute, 
org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:38:47 localhost podman[280497]: 2026-02-23 09:38:47.584673533 +0000 UTC m=+0.151483518 container start 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216) Feb 23 04:38:47 localhost nova_compute[280512]: + sudo -E kolla_set_configs Feb 23 04:38:47 localhost python3[280360]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman start nova_compute Feb 23 04:38:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:38:47 localhost systemd[1]: Started nova_compute container. 
Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Validating config file Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying service configuration files Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:38:47 localhost nova_compute[280512]: 
INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Deleting /etc/ceph Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Creating directory /etc/ceph Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Copying 
/var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Writing out command to execute Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:38:47 localhost nova_compute[280512]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:38:47 localhost nova_compute[280512]: ++ cat /run_command Feb 23 04:38:47 localhost nova_compute[280512]: + CMD=nova-compute Feb 23 04:38:47 localhost nova_compute[280512]: + ARGS= Feb 23 04:38:47 localhost nova_compute[280512]: + sudo kolla_copy_cacerts Feb 23 04:38:47 localhost nova_compute[280512]: + [[ ! -n '' ]] Feb 23 04:38:47 localhost nova_compute[280512]: + . 
kolla_extend_start Feb 23 04:38:47 localhost nova_compute[280512]: + echo 'Running command: '\''nova-compute'\''' Feb 23 04:38:47 localhost nova_compute[280512]: Running command: 'nova-compute' Feb 23 04:38:47 localhost nova_compute[280512]: + umask 0022 Feb 23 04:38:47 localhost nova_compute[280512]: + exec nova-compute Feb 23 04:38:47 localhost podman[280541]: 2026-02-23 09:38:47.735658342 +0000 UTC m=+0.086670965 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:38:47 localhost podman[280541]: 2026-02-23 09:38:47.817132574 +0000 UTC m=+0.168145217 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:38:47 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:38:48 localhost systemd[1]: var-lib-containers-storage-overlay-81751aaa607236c95e65d1a8738b503c5aac60039da98c3685192990b6f247b4-merged.mount: Deactivated successfully. Feb 23 04:38:48 localhost python3.9[280686]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:38:48.539 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:38:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:38:48.541 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:38:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:38:48.542 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:38:49 localhost python3.9[280799]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.292 280526 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.293 280526 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.293 280526 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.293 280526 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.412 280526 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.435 280526 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.435 280526 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 23 04:38:49 localhost python3.9[280858]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.820 280526 INFO nova.virt.driver [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.933 280526 INFO nova.compute.provider_config [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.943 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.944 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.944 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.944 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 23 04:38:49 localhost 
nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] arq_binding_timeout = 300 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.945 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.946 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console_host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost 
nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.947 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 
'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.948 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] force_config_drive = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.949 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.950 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_usage_audit_period = month 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.951 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.952 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] 
%(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.953 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_concurrent_builds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.954 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.955 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.956 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.957 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rate_limit_except_level = 
CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.958 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost 
nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.959 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.960 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.961 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] state_path = /var/lib/nova 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.962 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 
localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.963 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.964 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_concurrency.disable_process_locking = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.965 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.966 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.967 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.968 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.969 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.970 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.971 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.972 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.973 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.974 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.975 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 
localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.976 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.977 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.978 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service 
[None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.979 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 
DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.980 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 
- - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.981 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.service_type = accelerator log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.982 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.backend = sqlalchemy log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.983 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_max_retries = 20 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.984 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.985 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.986 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.987 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.988 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.989 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 
09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.990 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.991 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.992 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.993 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.994 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.995 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.996 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.997 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.998 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 
DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:49 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:49.999 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.000 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.001 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.002 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.certfile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.003 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.approle_secret_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 
2026-02-23 09:38:50.004 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.005 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.006 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.007 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.008 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.009 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.010 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.images_volume_group = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.011 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.012 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.013 280526 WARNING oslo_config.cfg [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 23 04:38:50 localhost nova_compute[280512]: live_migration_uri is deprecated for removal in favor of two other options that Feb 23 04:38:50 localhost nova_compute[280512]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 23 04:38:50 localhost nova_compute[280512]: and ``live_migration_inbound_addr`` respectively. Feb 23 04:38:50 localhost nova_compute[280512]: ). 
Its value may be silently ignored in the future.#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.014 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.015 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_secret_uuid = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.016 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.017 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.018 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.019 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.020 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.021 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.022 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 
localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.023 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 
09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.024 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.025 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.026 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.027 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.028 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.029 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.030 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.031 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.032 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.033 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.034 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.035 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.036 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.037 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.038 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.039 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.040 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.041 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.042 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.043 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.044 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.045 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.046 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 
2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.047 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 
09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.048 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.049 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 
localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.050 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.051 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.052 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.053 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.054 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost 
nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.055 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.056 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.057 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost 
nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.058 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.059 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.060 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.061 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.062 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.063 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.064 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.065 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.066 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None 
req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.067 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 
280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.068 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.069 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.070 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.071 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.071 280526 DEBUG 
oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.071 280526 DEBUG oslo_service.service [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.072 280526 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.089 280526 INFO nova.virt.node [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.089 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.090 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.090 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.090 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.101 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.103 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.104 280526 INFO nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Connection event '1' reason 'None'#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.110 280526 INFO nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host capabilities
Feb 23 04:38:50 localhost nova_compute[280512]: <capabilities>
Feb 23 04:38:50 localhost nova_compute[280512]:   <host>
Feb 23 04:38:50 localhost nova_compute[280512]:     <uuid>bdcaa433-cfc7-450a-99ab-f0985ab59447</uuid>
Feb 23 04:38:50 localhost nova_compute[280512]:     <cpu>
Feb 23 04:38:50 localhost nova_compute[280512]:       <arch>x86_64</arch>
Feb 23 04:38:50 localhost nova_compute[280512]:       <model>EPYC-Rome-v4</model>
Feb 23 04:38:50 localhost nova_compute[280512]:       <vendor>AMD</vendor>
Feb 23 04:38:50 localhost nova_compute[280512]:     </cpu>
Feb 23 04:38:50 localhost nova_compute[280512]:     <migration_features>
Feb 23 04:38:50 localhost nova_compute[280512]:       <uri_transports>
Feb 23 04:38:50 localhost nova_compute[280512]:         <uri_transport>tcp</uri_transport>
Feb 23 04:38:50 localhost nova_compute[280512]:         <uri_transport>rdma</uri_transport>
Feb 23 04:38:50 localhost nova_compute[280512]:       </uri_transports>
Feb 23 04:38:50 localhost nova_compute[280512]:     </migration_features>
Feb 23 04:38:50 localhost nova_compute[280512]:     <topology>
Feb 23 04:38:50 localhost nova_compute[280512]:       <cells num='1'>
Feb 23 04:38:50 localhost nova_compute[280512]:         <cell id='0'>
Feb 23 04:38:50 localhost nova_compute[280512]:           <memory unit='KiB'>16116612</memory>
Feb 23 04:38:50 localhost nova_compute[280512]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 23 04:38:50 localhost nova_compute[280512]:           <pages unit='KiB' size='2048'>0</pages>
Feb 23 04:38:50 localhost nova_compute[280512]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 23 04:38:50 localhost nova_compute[280512]:         </cell>
Feb 23 04:38:50 localhost nova_compute[280512]:       </cells>
Feb 23 04:38:50 localhost nova_compute[280512]:     </topology>
Feb 23 04:38:50 localhost nova_compute[280512]:     <secmodel>
Feb 23 04:38:50 localhost nova_compute[280512]:       <model>selinux</model>
Feb 23 04:38:50 localhost nova_compute[280512]:       <doi>0</doi>
Feb 23 04:38:50 localhost nova_compute[280512]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 23 04:38:50 localhost nova_compute[280512]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 23 04:38:50 localhost nova_compute[280512]:     </secmodel>
Feb 23 04:38:50 localhost nova_compute[280512]:   </host>
Feb 23 04:38:50 localhost
nova_compute[280512]: dac Feb 23 04:38:50 localhost nova_compute[280512]: 0 Feb 23 04:38:50 localhost nova_compute[280512]: +107:+107 Feb 23 04:38:50 localhost nova_compute[280512]: +107:+107 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: hvm Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 32 Feb 23 04:38:50 localhost nova_compute[280512]: /usr/libexec/qemu-kvm Feb 23 04:38:50 localhost nova_compute[280512]: pc-i440fx-rhel7.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.8.0 Feb 23 04:38:50 localhost nova_compute[280512]: q35 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.4.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.5.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.3.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel7.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.4.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.2.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.2.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.0.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.0.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.1.0 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: hvm Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 64 Feb 23 04:38:50 localhost nova_compute[280512]: /usr/libexec/qemu-kvm Feb 23 04:38:50 localhost nova_compute[280512]: pc-i440fx-rhel7.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.8.0 Feb 23 04:38:50 localhost nova_compute[280512]: q35 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.4.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.5.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.3.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel7.6.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.4.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.2.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.2.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.0.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.0.0 Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel8.1.0 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: #033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.118 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.123 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: /usr/libexec/qemu-kvm Feb 23 04:38:50 localhost nova_compute[280512]: kvm Feb 23 04:38:50 localhost nova_compute[280512]: pc-q35-rhel9.8.0 Feb 23 04:38:50 localhost nova_compute[280512]: i686 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: rom Feb 23 04:38:50 localhost nova_compute[280512]: pflash Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: yes Feb 23 04:38:50 localhost nova_compute[280512]: no Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: no Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: on Feb 23 04:38:50 localhost nova_compute[280512]: off Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: on Feb 23 04:38:50 localhost nova_compute[280512]: off Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome Feb 23 04:38:50 localhost nova_compute[280512]: AMD Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 486 Feb 23 04:38:50 localhost nova_compute[280512]: 486-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: ClearwaterForest Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: ClearwaterForest-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Conroe Feb 23 04:38:50 localhost nova_compute[280512]: Conroe-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Cooperlake Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cooperlake-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cooperlake-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Dhyana Feb 23 04:38:50 localhost nova_compute[280512]: Dhyana-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Dhyana-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Genoa Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 
Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Genoa-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Genoa-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-IBPB
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan-v2
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan-v3
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v2
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v3
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v4
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v5
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Turin
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Turin-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v2
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v3
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v4
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v5
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v1
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v2
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v2
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v4
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-noTSX
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v2
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v4
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v5
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v6
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v7
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v1
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v2
Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill
Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-v2
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G1-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G2
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G2-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G3
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G3-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G4
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G4-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G5
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G5-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Penryn
Feb 23 04:38:50 localhost nova_compute[280512]: Penryn-v1
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge-v1
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge-v2
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v1
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest-v2 Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-IBRS Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 
23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Westmere Feb 23 04:38:50 localhost nova_compute[280512]: 
Westmere-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Westmere-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Westmere-v2 Feb 23 04:38:50 localhost nova_compute[280512]: athlon Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: athlon-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: core2duo Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: core2duo-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: coreduo Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: coreduo-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: kvm32 Feb 23 04:38:50 localhost nova_compute[280512]: kvm32-v1 Feb 23 04:38:50 localhost nova_compute[280512]: kvm64 Feb 23 04:38:50 localhost nova_compute[280512]: kvm64-v1 Feb 23 04:38:50 localhost nova_compute[280512]: n270 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: n270-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
Feb 23 04:38:50 localhost nova_compute[280512]: [libvirt domainCapabilities XML, continued; element markup lost in log capture — recovered values, in document order:]
Feb 23 04:38:50 localhost nova_compute[280512]:   cpu models (cont.): pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 23 04:38:50 localhost nova_compute[280512]:   memory backing source types: file anonymous memfd
Feb 23 04:38:50 localhost nova_compute[280512]:   disk device types: disk cdrom floppy lun
Feb 23 04:38:50 localhost nova_compute[280512]:   disk bus types: fdc scsi virtio usb sata
Feb 23 04:38:50 localhost nova_compute[280512]:   disk models: virtio virtio-transitional virtio-non-transitional
Feb 23 04:38:50 localhost nova_compute[280512]:   graphics types: vnc egl-headless dbus
Feb 23 04:38:50 localhost nova_compute[280512]:   hostdev: mode subsystem; startup policy default mandatory requisite optional; subsystem types usb pci scsi
Feb 23 04:38:50 localhost nova_compute[280512]:   rng: models virtio virtio-transitional virtio-non-transitional; backend models random egd builtin
Feb 23 04:38:50 localhost nova_compute[280512]:   filesystem driver types: path handle virtiofs
Feb 23 04:38:50 localhost nova_compute[280512]:   tpm: models tpm-tis tpm-crb; backend models emulator external; backend version 2.0
Feb 23 04:38:50 localhost nova_compute[280512]:   redirdev bus: usb
Feb 23 04:38:50 localhost nova_compute[280512]:   channel types: pty unix
Feb 23 04:38:50 localhost nova_compute[280512]:   further values: qemu; builtin
Feb 23 04:38:50 localhost nova_compute[280512]:   interface backend types: default passt
Feb 23 04:38:50 localhost nova_compute[280512]:   panic models: isa hyperv
Feb 23 04:38:50 localhost nova_compute[280512]:   char device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 23 04:38:50 localhost nova_compute[280512]:   hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
Feb 23 04:38:50 localhost nova_compute[280512]:   further values: 4095 on off off "Linux KVM Hv"
Feb 23 04:38:50 localhost nova_compute[280512]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.127 280526 DEBUG nova.virt.libvirt.volume.mount [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.132 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 23 04:38:50 localhost nova_compute[280512]: [domainCapabilities XML for arch=i686, machine_type=pc; element markup lost in log capture — recovered values:]
Feb 23 04:38:50 localhost nova_compute[280512]:   path /usr/libexec/qemu-kvm; domain kvm; machine pc-i440fx-rhel7.6.0; arch i686
Feb 23 04:38:50 localhost nova_compute[280512]:   loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; types rom pflash; readonly yes no; secure no; further values on off; on off
Feb 23 04:38:50 localhost nova_compute[280512]:   host cpu: model EPYC-Rome, vendor AMD
Feb 23 04:38:50 localhost nova_compute[280512]:   cpu models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Cascadelake-Server-v1 Cascadelake-Server-v2 Cascadelake-Server-v3 Cascadelake-Server-v4 Cascadelake-Server-v5 ClearwaterForest ClearwaterForest-v1 Conroe Conroe-v1 Cooperlake Cooperlake-v1 Cooperlake-v2 Denverton Denverton-v1 Denverton-v2 Denverton-v3 Dhyana Dhyana-v1 Dhyana-v2 EPYC EPYC-Genoa EPYC-Genoa-v1 EPYC-Genoa-v2 EPYC-IBPB EPYC-Milan EPYC-Milan-v1 EPYC-Milan-v2 EPYC-Milan-v3 EPYC-Rome EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-Rome-v5 EPYC-Turin EPYC-Turin-v1
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v1 Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v2 Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 
23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 
Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Icelake-Server-v6 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v7 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 
Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
Feb 23 04:38:50 localhost nova_compute[280512]: [libvirt domainCapabilities XML logged here; the markup was stripped during log capture, so only the text values survive — grouped below in order of appearance, element names omitted where lost]
Feb 23 04:38:50 localhost nova_compute[280512]:   CPU models: KnightsMill-v1, Nehalem, Nehalem-IBRS, Nehalem-v1, Nehalem-v2, Opteron_G1, Opteron_G1-v1, Opteron_G2, Opteron_G2-v1, Opteron_G3, Opteron_G3-v1, Opteron_G4, Opteron_G4-v1, Opteron_G5, Opteron_G5-v1, Penryn, Penryn-v1, SandyBridge, SandyBridge-IBRS, SandyBridge-v1, SandyBridge-v2, SapphireRapids, SapphireRapids-v1, SapphireRapids-v2, SapphireRapids-v3, SapphireRapids-v4, SierraForest, SierraForest-v1, SierraForest-v2, SierraForest-v3, Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Feb 23 04:38:50 localhost nova_compute[280512]:   memory backing source types: file, anonymous, memfd
Feb 23 04:38:50 localhost nova_compute[280512]:   disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Feb 23 04:38:50 localhost nova_compute[280512]:   graphics types: vnc, egl-headless, dbus
Feb 23 04:38:50 localhost nova_compute[280512]:   hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Feb 23 04:38:50 localhost nova_compute[280512]:   rng models: virtio, virtio-transitional, virtio-non-transitional; backend models: random, egd, builtin
Feb 23 04:38:50 localhost nova_compute[280512]:   filesystem driver types: path, handle, virtiofs
Feb 23 04:38:50 localhost nova_compute[280512]:   tpm models: tpm-tis, tpm-crb; backend types: emulator, external; backend version: 2.0
Feb 23 04:38:50 localhost nova_compute[280512]:   redirdev bus: usb; channel types: pty, unix; crypto: qemu, builtin; interface backends: default, passt; panic models: isa, hyperv
Feb 23 04:38:50 localhost nova_compute[280512]:   serial/console character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Feb 23 04:38:50 localhost nova_compute[280512]:   hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
Feb 23 04:38:50 localhost nova_compute[280512]: 4095; on; off; off; Linux KVM Hv
Feb 23 04:38:50 localhost nova_compute[280512]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.184 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.192 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 23 04:38:50 localhost nova_compute[280512]: emulator: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: x86_64; os firmware: efi
Feb 23 04:38:50 localhost nova_compute[280512]: firmware images: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd, /usr/share/edk2/ovmf/OVMF_CODE.fd, /usr/share/edk2/ovmf/OVMF.amdsev.fd, /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: yes, no; feature toggles: on, off; on, off
Feb 23 04:38:50 localhost nova_compute[280512]: host-model CPU: EPYC-Rome, vendor: AMD
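The capabilities above are reported by libvirt as a domain-capabilities XML document, which nova's `_get_domain_capabilities` retrieves per architecture and machine type. A minimal sketch of extracting the key fields follows; this is not nova's actual code, and the inline `DOMCAPS` snippet is a hypothetical reconstruction whose element names follow the libvirt domcapabilities schema, with values taken from this log.

```python
# Hedged sketch: parse a libvirt domain-capabilities XML document to pull
# out the emulator path, machine type, host CPU model, and usable custom
# CPU models. The XML below is a hand-built minimal example, not the full
# document that libvirt actually returned.
import xml.etree.ElementTree as ET

DOMCAPS = """\
<domainCapabilities>
  <path>/usr/libexec/qemu-kvm</path>
  <domain>kvm</domain>
  <machine>pc-q35-rhel9.8.0</machine>
  <arch>x86_64</arch>
  <cpu>
    <mode name='host-model' supported='yes'>
      <model fallback='forbid'>EPYC-Rome</model>
      <vendor>AMD</vendor>
    </mode>
    <mode name='custom' supported='yes'>
      <model usable='yes'>EPYC-Rome</model>
      <model usable='no'>Cascadelake-Server</model>
    </mode>
  </cpu>
</domainCapabilities>
"""

def parse_domcaps(xml_text):
    """Return a dict of the fields a scheduler/driver typically cares about."""
    root = ET.fromstring(xml_text)
    usable = [m.text
              for m in root.findall("./cpu/mode[@name='custom']/model")
              if m.get("usable") == "yes"]
    return {
        "emulator": root.findtext("path"),
        "machine": root.findtext("machine"),
        "arch": root.findtext("arch"),
        "host_model": root.findtext("./cpu/mode[@name='host-model']/model"),
        "usable_models": usable,
    }

caps = parse_domcaps(DOMCAPS)
print(caps["machine"], caps["host_model"], caps["usable_models"])
```

In a live deployment the XML would come from `virConnect.getDomainCapabilities()` (or `virsh domcapabilities`) rather than an inline string; only models flagged `usable='yes'` can be requested without extra feature tuning.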
Feb 23 04:38:50 localhost nova_compute[280512]: CPU models listed: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-Rome-v5, EPYC-Turin, EPYC-Turin-v1, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, EPYC-v5, GraniteRapids, (list continues)
23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell 
Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v6 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v7 
Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v2 Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G1 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G1-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G2 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G2-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G3 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G3-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G4 Feb 23 04:38:50 localhost 
Feb 23 04:38:50 localhost nova_compute[280512]: [libvirt domainCapabilities XML logged here; the XML markup was lost in log capture and each element produced an empty record — the repeated empty records are collapsed below, keeping the surviving element values in their original order]
Feb 23 04:38:50 localhost nova_compute[280512]: CPU models (continued): Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SapphireRapids-v4 SierraForest SierraForest-v1 SierraForest-v2 SierraForest-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5
Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Feb 23 04:38:50 localhost nova_compute[280512]: memory backing sources: file anonymous memfd
Feb 23 04:38:50 localhost nova_compute[280512]: disk devices: disk cdrom floppy lun | buses: fdc scsi virtio usb sata | models: virtio virtio-transitional virtio-non-transitional
Feb 23 04:38:50 localhost nova_compute[280512]: graphics: vnc egl-headless dbus
Feb 23 04:38:50 localhost nova_compute[280512]: hostdev mode: subsystem | startup policy: default mandatory requisite optional | subsystem types: usb pci scsi | models: virtio virtio-transitional virtio-non-transitional
Feb 23 04:38:50 localhost nova_compute[280512]: rng backends: random egd builtin
Feb 23 04:38:50 localhost nova_compute[280512]: filesystem drivers: path handle virtiofs
Feb 23 04:38:50 localhost nova_compute[280512]: tpm models: tpm-tis tpm-crb | backends: emulator external | version: 2.0
Feb 23 04:38:50 localhost nova_compute[280512]: redirdev bus: usb | channel types: pty unix | crypto: qemu builtin | interface backends: default passt | panic models: isa hyperv
Feb 23 04:38:50 localhost nova_compute[280512]: char device types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Feb 23 04:38:50 localhost nova_compute[280512]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input | other values, in order: 4095 on off off "Linux KVM Hv"
Feb 23 04:38:50 localhost nova_compute[280512]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.246 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 23 04:38:50 localhost nova_compute[280512]: [capabilities XML follows, markup again stripped; surviving values:] /usr/libexec/qemu-kvm kvm pc-i440fx-rhel7.6.0 x86_64
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: /usr/share/OVMF/OVMF_CODE.secboot.fd Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: rom Feb 23 04:38:50 localhost nova_compute[280512]: pflash Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: yes Feb 23 04:38:50 localhost nova_compute[280512]: no Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: no Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: on Feb 23 04:38:50 localhost nova_compute[280512]: off Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: on Feb 23 04:38:50 localhost nova_compute[280512]: off Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome Feb 23 04:38:50 localhost nova_compute[280512]: AMD Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: 486 Feb 23 04:38:50 localhost nova_compute[280512]: 486-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Broadwell-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Broadwell-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-noTSX Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 
23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cascadelake-Server-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: ClearwaterForest Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: ClearwaterForest-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Conroe Feb 23 04:38:50 localhost nova_compute[280512]: Conroe-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Cooperlake Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cooperlake-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Cooperlake-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Denverton-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Dhyana Feb 23 04:38:50 localhost nova_compute[280512]: Dhyana-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Dhyana-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Genoa Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Genoa-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Genoa-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-IBPB Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Milan-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v2
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v3
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v4
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Rome-v5
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Turin
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-Turin-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v1
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v2
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v3
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v4
Feb 23 04:38:50 localhost nova_compute[280512]: EPYC-v5
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v1
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v2
Feb 23 04:38:50 localhost nova_compute[280512]: GraniteRapids-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-noTSX-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v2
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Haswell-v4
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-noTSX
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v2
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v3
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v4
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v5
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v6
Feb 23 04:38:50 localhost nova_compute[280512]: Icelake-Server-v7
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v1
Feb 23 04:38:50 localhost nova_compute[280512]: IvyBridge-v2
Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill
Feb 23 04:38:50 localhost nova_compute[280512]: KnightsMill-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Nehalem-v2
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G1-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G2
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G2-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G3
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G3-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G4
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G4-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G5
Feb 23 04:38:50 localhost nova_compute[280512]: Opteron_G5-v1
Feb 23 04:38:50 localhost nova_compute[280512]: Penryn
Feb 23 04:38:50 localhost nova_compute[280512]: Penryn-v1
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge-IBRS
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge-v1
Feb 23 04:38:50 localhost nova_compute[280512]: SandyBridge-v2
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v1
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v2
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v3
Feb 23 04:38:50 localhost nova_compute[280512]: SapphireRapids-v4
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: SierraForest Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: SierraForest-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 
04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Client-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-noTSX-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Skylake-Server-v5 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v2 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v3 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Snowridge-v4 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Westmere Feb 23 04:38:50 localhost nova_compute[280512]: Westmere-IBRS Feb 23 04:38:50 localhost nova_compute[280512]: Westmere-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Westmere-v2 Feb 23 04:38:50 localhost nova_compute[280512]: athlon Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: athlon-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: core2duo Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: core2duo-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: coreduo Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: coreduo-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost 
nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: kvm32 Feb 23 04:38:50 localhost nova_compute[280512]: kvm32-v1 Feb 23 04:38:50 localhost nova_compute[280512]: kvm64 Feb 23 04:38:50 localhost nova_compute[280512]: kvm64-v1 Feb 23 04:38:50 localhost nova_compute[280512]: n270 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: n270-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: pentium Feb 23 04:38:50 localhost nova_compute[280512]: pentium-v1 Feb 23 04:38:50 localhost nova_compute[280512]: pentium2 Feb 23 04:38:50 localhost nova_compute[280512]: pentium2-v1 Feb 23 04:38:50 localhost nova_compute[280512]: pentium3 Feb 23 04:38:50 localhost nova_compute[280512]: pentium3-v1 Feb 23 04:38:50 localhost nova_compute[280512]: phenom Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: phenom-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: qemu32 Feb 23 04:38:50 localhost nova_compute[280512]: qemu32-v1 Feb 23 04:38:50 localhost nova_compute[280512]: qemu64 Feb 23 04:38:50 localhost nova_compute[280512]: qemu64-v1 Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: file Feb 23 04:38:50 localhost 
nova_compute[280512]: anonymous Feb 23 04:38:50 localhost nova_compute[280512]: memfd Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: disk Feb 23 04:38:50 localhost nova_compute[280512]: cdrom Feb 23 04:38:50 localhost nova_compute[280512]: floppy Feb 23 04:38:50 localhost nova_compute[280512]: lun Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: ide Feb 23 04:38:50 localhost nova_compute[280512]: fdc Feb 23 04:38:50 localhost nova_compute[280512]: scsi Feb 23 04:38:50 localhost nova_compute[280512]: virtio Feb 23 04:38:50 localhost nova_compute[280512]: usb Feb 23 04:38:50 localhost nova_compute[280512]: sata Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: virtio Feb 23 04:38:50 localhost nova_compute[280512]: virtio-transitional Feb 23 04:38:50 localhost nova_compute[280512]: virtio-non-transitional Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: vnc Feb 23 04:38:50 localhost nova_compute[280512]: egl-headless Feb 23 04:38:50 localhost nova_compute[280512]: dbus Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: subsystem Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 
23 04:38:50 localhost nova_compute[280512]: default Feb 23 04:38:50 localhost nova_compute[280512]: mandatory Feb 23 04:38:50 localhost nova_compute[280512]: requisite Feb 23 04:38:50 localhost nova_compute[280512]: optional Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: usb Feb 23 04:38:50 localhost nova_compute[280512]: pci Feb 23 04:38:50 localhost nova_compute[280512]: scsi Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: virtio Feb 23 04:38:50 localhost nova_compute[280512]: virtio-transitional Feb 23 04:38:50 localhost nova_compute[280512]: virtio-non-transitional Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: random Feb 23 04:38:50 localhost nova_compute[280512]: egd Feb 23 04:38:50 localhost nova_compute[280512]: builtin Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: path Feb 23 04:38:50 localhost nova_compute[280512]: handle Feb 23 04:38:50 localhost nova_compute[280512]: virtiofs Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: tpm-tis Feb 23 04:38:50 localhost nova_compute[280512]: tpm-crb Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 localhost nova_compute[280512]: Feb 23 04:38:50 
localhost nova_compute[280512]: [multi-line libvirt domain-capabilities XML, logged one element per journald record with the tags stripped; the recoverable element values, in order, are: emulator, external, 2.0, usb, pty, unix, qemu, builtin, default, passt, isa, hyperv; character-device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus; Hyper-V enlightenment features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; plus the values 4095, on, off, off, and the vendor string 'Linux KVM Hv'] _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.297 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.298 280526 INFO nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Secure Boot support detected#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.301 280526 INFO nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Feb 23 04:38:50 localhost
nova_compute[280512]: 2026-02-23 09:38:50.302 280526 INFO nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.317 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.343 280526 INFO nova.virt.node [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.357 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Verified node be63d86c-a403-4ec9-a515-07ea2962cb4d matches my host np0005626463.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.387 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.392 280526 DEBUG nova.virt.libvirt.vif [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-23T08:22:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005626463.localdomain',hostname='test',id=3,image_ref='a9204248-210d-45b5-ab0a-d1ec08a73a4f',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2026-02-23T08:23:11Z,launched_on='np0005626463.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005626463.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='37b8098efb0d4ecc90b451a2db0e966f',ramdisk_id='',reservation_id='r-90tij075',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2026-02-23T08:23:11Z,user_data=None,user_id='cb6895487918456aa599ca2f76872d00',uuid=c2a7d92b-952f-46a7-8a6a-3322a48fcf4b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": 
false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.392 280526 DEBUG nova.network.os_vif_util [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Converting VIF {"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 
09:38:50.393 280526 DEBUG nova.network.os_vif_util [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.394 280526 DEBUG os_vif [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.489 280526 DEBUG ovsdbapp.backend.ovs_idl [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.490 280526 DEBUG ovsdbapp.backend.ovs_idl [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.490 280526 DEBUG ovsdbapp.backend.ovs_idl [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Feb 23 04:38:50 localhost 
nova_compute[280512]: 2026-02-23 09:38:50.490 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.491 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.491 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.491 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.493 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.496 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.510 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.511 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.511 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:38:50 localhost nova_compute[280512]: 2026-02-23 09:38:50.512 280526 INFO oslo.privsep.daemon [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpc7cdv_y2/privsep.sock']#033[00m Feb 23 04:38:50 localhost python3.9[280990]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771839530.1622736-3312-130469988177729/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.189 280526 INFO oslo.privsep.daemon [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.063 280993 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.068 280993 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 
2026-02-23 09:38:51.071 280993 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.072 280993 INFO oslo.privsep.daemon [-] privsep daemon running as pid 280993#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.441 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.442 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa27e5011-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.443 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa27e5011-20, col_values=(('external_ids', {'iface-id': 'a27e5011-2016-4b16-b5e8-04b555b30bc4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:9d:00', 'vm-uuid': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.444 280526 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.445 280526 INFO os_vif [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Successfully plugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:a0:9d:00,bridge_name='br-int',has_traffic_filtering=True,id=a27e5011-2016-4b16-b5e8-04b555b30bc4,network=Network(9da5b53d-3184-450f-9a5b-bdba1a6c9f6d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa27e5011-20')#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.445 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.450 280526 DEBUG nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.450 280526 INFO nova.compute.manager [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.525 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.525 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:38:51 localhost 
nova_compute[280512]: 2026-02-23 09:38:51.526 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.526 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.527 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:38:51 localhost python3.9[281051]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:38:51 localhost nova_compute[280512]: 2026-02-23 09:38:51.709 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:51 localhost sshd[281091]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.027 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:38:52 localhost 
nova_compute[280512]: 2026-02-23 09:38:52.091 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.091 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.308 280526 WARNING nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.310 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12192MB free_disk=41.83688735961914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": 
"0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.310 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.311 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.448 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.448 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.449 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.496 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.517 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 
0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.517 280526 DEBUG nova.compute.provider_tree [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.535 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.556 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_MMX,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE41,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AMD_SVM,HW_CPU_X86_SVM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_FMA3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AESNI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,HW_CPU_X86_AVX,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000E _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 04:38:52 localhost nova_compute[280512]: 2026-02-23 09:38:52.592 280526 DEBUG oslo_concurrency.processutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.053 280526 DEBUG oslo_concurrency.processutils [None 
req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.059 280526 DEBUG nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Feb 23 04:38:53 localhost nova_compute[280512]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.059 280526 INFO nova.virt.libvirt.host [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] kernel doesn't support AMD SEV#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.061 280526 DEBUG nova.compute.provider_tree [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.062 280526 DEBUG nova.virt.libvirt.driver [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.084 280526 DEBUG nova.scheduler.client.report [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.111 280526 DEBUG nova.compute.resource_tracker [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.112 280526 DEBUG oslo_concurrency.lockutils [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.112 280526 DEBUG nova.service [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.139 280526 DEBUG nova.service [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Feb 23 04:38:53 localhost nova_compute[280512]: 2026-02-23 09:38:53.140 280526 DEBUG nova.servicegroup.drivers.db [None req-b8927e34-219e-43f3-a84d-184de0519090 - - - - - -] DB_Driver: join new ServiceGroup member np0005626463.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Feb 23 04:38:53 localhost python3.9[281207]: ansible-ansible.builtin.slurp Invoked with 
src=/var/lib/edpm-config/deployed_services.yaml Feb 23 04:38:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57420 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD120F0000000001030307) Feb 23 04:38:54 localhost python3.9[281317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Feb 23 04:38:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57421 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD16060000000001030307) Feb 23 04:38:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:38:55 localhost podman[281408]: 2026-02-23 09:38:55.260479466 +0000 UTC m=+0.087268795 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:38:55 localhost podman[281408]: 2026-02-23 09:38:55.274281465 +0000 UTC m=+0.101070784 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:38:55 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:38:55 localhost python3.9[281407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771839534.3600266-3435-280990999344300/.source.yaml _original_basename=.h_bgzpl6 follow=False checksum=4185f12b535f7417c8eab31aeeb8094a78600762 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:38:55 localhost nova_compute[280512]: 2026-02-23 09:38:55.522 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.132 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 23 04:38:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23723 DF PROTO=TCP SPT=54442 DPT=9102 SEQ=1737972057 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD1A060000000001030307) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.137 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efddab9c-4c17-4c4f-ab69-f617ce278b85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 145, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.133811', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79027796-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'b0ab021e1ec2ae764e04f5b1bca6022742a3a3043f6f0e3d6efcba988dff7345'}]}, 'timestamp': '2026-02-23 09:38:56.137973', '_unique_id': 'e894ffb1be2f4e9ba8e42b66eed85c22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return 
fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.139 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.140 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.141 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 9662 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '511ec253-929e-4ce5-904b-34c38c37a49f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9662, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.141189', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79030c6a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '06e696cec07968443c4857ce6128b3f5a04b0bed4103b1c9a9abcb57601add4f'}]}, 'timestamp': '2026-02-23 09:38:56.141664', '_unique_id': '9f36e6ca9e494ff7be73a879419784ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.142 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.180 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.180 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39f01e91-43c5-4201-9c4d-d46da94bf8f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.143906', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790906f6-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '1ccce793b329504a41ab8727b527d66253fb2b574fe5f601f501e992401ac678'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.143906', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '79091c72-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'a75fb76b25d0a86c472073f6f45131f0b5292310728e052c0f1c4b44bcc70bae'}]}, 'timestamp': '2026-02-23 09:38:56.181377', '_unique_id': '4d5b2d7dee174fa7b919036c42ab0d75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.182 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fab113f-9e64-4927-b6e3-a5a3d737365d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.184153', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79099b3e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '948efb18233c794d576f5092b1665ccfc4aa047bc8b01cfffb953334892d7424'}]}, 'timestamp': '2026-02-23 09:38:56.184641', '_unique_id': '8426793022e34c56983764bb19535bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.185 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1234377028 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 170393160 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging [-] Could not 
send notification to notifications. Payload={'message_id': '7854faaa-c32c-4e0c-aa87-e77b7857bafb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1234377028, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.187330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790a1712-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'af8e89a001952766683a83fc4089d4862a8d9a83869be6a0bede107a08813c6d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 170393160, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.187330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 
'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '790a2928-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'e55ef8a1302a6f9f24650bcc423e4677bcaf0300f9bbc79937c834a4e9f5046f'}]}, 'timestamp': '2026-02-23 09:38:56.188299', '_unique_id': 'a463e0e5eba145a7b9aa526ef461a043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.189 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.190 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.190 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 577 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.191 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd2bc3ca-2d42-4af9-a72f-af0bd76f1dc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 577, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.190793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790aa006-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '02711a052d2aaf8b14df220a3c54b4741987efdf009bbbfe51e73c30516b51f3'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.190793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '790ab1e0-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '40bdea69a80c696e4dfa96f09aaf886a823048681014f89a5e815ae598eeb622'}]}, 'timestamp': '2026-02-23 09:38:56.191757', '_unique_id': '4837a049601948918b5b5b887ca2ecfb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.208 12 DEBUG ceilometer.compute.pollsters [-] 
c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.208 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74a0e0cf-a670-4a46-bd95-0365a98f4839', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.195399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '790d40a4-109b-11f1-b3f4-fa163e9ab6c6', 
'monotonic_time': 10976.384902945, 'message_signature': 'cad3c6c13edd14ac17e3a6506c617f32fea3dbb36bb5c6a08ee80ecf75221976'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.195399', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '790d522e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': 'adbe40316c205c6403f06f28a6eb6cb1e50c1d8711d8edc4e98ba27b8f5e6700'}]}, 'timestamp': '2026-02-23 09:38:56.208999', '_unique_id': '42399bf38b8648d29bd1b5d66a39345b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 
23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.210 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.211 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 57690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '695c3b76-de03-40d9-b246-f6c41322010a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57690000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:38:56.211345', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7910d39a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.420805905, 'message_signature': 'c136ca2c6a36d7adef824c7f10e3988a9a8abb0a15bfa801dc8ae622e687d836'}]}, 'timestamp': '2026-02-23 09:38:56.232021', '_unique_id': '6ca4e688f18946b3affbe370d76f8262'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.233 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1d7ff8a-a343-4a7a-be59-0b01ff48b764', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.235021', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7911847a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'e5c6ba4e0d5fb81403b1b56d7d9fcde550c420e5048103c40164807489d37bb1'}]}, 'timestamp': '2026-02-23 09:38:56.236610', '_unique_id': '90f957af34734f3ca30cbdff386f971f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '65d0621f-56ed-42b1-9db8-c6e71de1a74f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.239766', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '79121980-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '0c272eff581eaf41d48e29784df155e6c9eb0c3fd18432f270591c6b57f2784a'}]}, 'timestamp': '2026-02-23 09:38:56.240308', '_unique_id': 'cf9ce7374e094c59b3f4a5dbc33ef5b7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2233a0c2-36b4-4d7c-b402-2687ec1b21d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.242509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '79128230-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '5a4b9b91d704146817fb0c7b1676a86edf6611f8cd78ae747d607eca4b40e5a6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.242509', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7912954a-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '2b60673eedf69fea8483cc521278372f3c586b6c1b68749b73e19639c607b3e7'}]}, 'timestamp': '2026-02-23 09:38:56.243442', '_unique_id': 'eaf746eaed5243dfb2f9bb6cb5e02503'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.244 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 260974500 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 24478467 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8689e10c-d450-4b10-94b8-c646d0993c5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 260974500, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.245674', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7912feb8-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'f8b49c146d1fd2c729a2bd95fd5fbeea0a7c29b56890f21dac8e9d484826264a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24478467, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.245674', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '79130f52-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '94bb849f413b01337f2209bd83c27e86438c20308f07b256c2bd4fb16a0c76eb'}]}, 'timestamp': '2026-02-23 09:38:56.246560', '_unique_id': '9990de3a6744486a82e3db31f7c3b4c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 12784 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5484fe47-9f26-4214-a341-9005e0d5158e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12784, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.248784', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7913788e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '7becb80498e0b68928d1cf0cafa8f5af23c555f59e18ccadc7914f2fe13a6859'}]}, 'timestamp': '2026-02-23 09:38:56.249284', '_unique_id': 'd38b7029a3144e34937cd9bf5572dc58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.250 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 92 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30d0d94-a1a8-44a7-be3f-7c32a7158d7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 92, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.251592', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7913e486-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '4f093aaa89a051055c1b9a6b4e557a334db0ddb1c5dd6251acd3e93682bb3ee1'}]}, 'timestamp': '2026-02-23 09:38:56.252080', '_unique_id': '1d8c888f00c34cf596fe80d8fdee4293'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.253 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9508448-c634-49a6-94f5-70589cf47ac2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.254221', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '79144b56-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '1b60d3f9b78f7701330298ac19b6dd51461514c9b6e26de11b0444eceb5b148a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.254221', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '79145d4e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.384902945, 'message_signature': '1a8bd01dcc55e9e2f2ce3c5c7b4801d870a47b83c8a5dae028be78fb65cfba3b'}]}, 'timestamp': '2026-02-23 09:38:56.255116', '_unique_id': '9e1ff4cdcb00447e97499b5cdfa9a529'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23
09:38:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:38:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '752d258c-5281-4ae1-98c0-e96dffda710e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.257296', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7914c388-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '3068665d16d6e88f332638fc2b4b18d356dc75d809d4b05b1780cc1fdd792c7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.257296', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7914d512-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'd54e0e8ab4fc501daf60de18fad0c8e998dda6d5f27996051595f54472427a97'}]}, 'timestamp': '2026-02-23 09:38:56.258236', '_unique_id': '53ad012b92af4fc2ac818ba34506a439'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 52.38671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c51393ce-7992-42fb-9fd3-6a007e64a3d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.38671875, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:38:56.260447', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '79153ebc-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.420805905, 'message_signature': '9f4879bccf9264921efee80beb7361a94e05365865bcd2275fc25029cf625cba'}]}, 'timestamp': '2026-02-23 09:38:56.260928', '_unique_id': 'f599fd60021947568f57e6987525903b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:38:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 74063872 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b987e48-1684-44d2-b72c-e418c1da0899', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 74063872, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:38:56.263081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7915a56e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': 'bb970f494c9a2a7a31bc793ba8906e76f291ceef48fa5a6e969710b3f6096526'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:38:56.263081', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7915b5c2-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.333390578, 'message_signature': '7edb54985b6773a651d6cc82831ff7cb504cc7b9cbe5429c75abe3f0de005198'}]}, 'timestamp': '2026-02-23 09:38:56.263959', '_unique_id': 'b2421ff1a75e4af0b367812bc800200f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.packets.error in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97cdaff5-a1bb-4cd5-a433-867f9037cc4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.266142', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 
'79161d32-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': '5bb0ed5ffa5be6b83519ce0d4966e2fc40a329dd35939af26025fb6369a4e7ba'}]}, 'timestamp': '2026-02-23 09:38:56.266603', '_unique_id': '806a0785a80c40749c1f20b8b4f4e3dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.268 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd2f43810-d966-4c22-b6c3-5cd450d38bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.268766', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '791684d4-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'ceefa987dfa5af62cfe815cb3803c9daa1846de97fd73d339262879aa00ad697'}]}, 'timestamp': '2026-02-23 09:38:56.269260', '_unique_id': '0201d35d6b234b4fa09920c54f77e86c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.270 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.271 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db0b8c95-3d98-473c-9966-332d09277754', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:38:56.271466', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '7916ed3e-109b-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 10976.323322544, 'message_signature': 'd45b069e669a9a710c9a53f778c68b0f9bb35ece266d55b9373dfeff239ce2df'}]}, 'timestamp': '2026-02-23 09:38:56.271965', '_unique_id': '99f6959aa9924bf6899ac337dbd9e522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:38:56.272 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:38:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:38:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:38:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:38:56.272 12 ERROR oslo_messaging.notify.messaging Feb 23 04:38:56 localhost python3.9[281537]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:56 localhost nova_compute[280512]: 2026-02-23 09:38:56.734 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:38:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57422 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD1E070000000001030307) Feb 23 04:38:57 localhost python3.9[281645]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:38:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64124 DF PROTO=TCP SPT=56114 DPT=9102 SEQ=276046667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A3BFD22060000000001030307) Feb 23 04:38:58 localhost python3.9[281753]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Feb 23 04:39:00 localhost python3.9[281863]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None 
memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 23 04:39:00 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 120.4 (401 of 333 items), suggesting rotation. Feb 23 04:39:00 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 04:39:00 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:39:00 localhost nova_compute[280512]: 2026-02-23 09:39:00.566 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:39:00 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:39:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57423 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD2DC60000000001030307) Feb 23 04:39:01 localhost python3.9[281998]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Feb 23 04:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:39:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5152 writes, 23K keys, 5152 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5152 writes, 679 syncs, 7.59 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 04:39:01 localhost nova_compute[280512]: 2026-02-23 09:39:01.787 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:39:02 localhost systemd[1]: Stopping nova_compute container... 
Feb 23 04:39:02 localhost nova_compute[280512]: 2026-02-23 09:39:02.492 280526 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Feb 23 04:39:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:39:03 localhost podman[282034]: 2026-02-23 09:39:03.715652203 +0000 UTC m=+0.088817382 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.7) Feb 23 04:39:03 localhost podman[282034]: 2026-02-23 09:39:03.7290309 +0000 UTC m=+0.102196089 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347) Feb 23 04:39:03 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:39:05 localhost nova_compute[280512]: 2026-02-23 09:39:05.603 280526 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:39:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 5421 writes, 24K keys, 5421 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5421 writes, 705 syncs, 7.69 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 04:39:06 localhost nova_compute[280512]: 2026-02-23 09:39:06.371 280526 WARNING amqp [-] Received method (60, 30) during closing channel 1. 
This method will be ignored#033[00m Feb 23 04:39:06 localhost nova_compute[280512]: 2026-02-23 09:39:06.373 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:39:06 localhost nova_compute[280512]: 2026-02-23 09:39:06.374 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:39:06 localhost nova_compute[280512]: 2026-02-23 09:39:06.374 280526 DEBUG oslo_concurrency.lockutils [None req-549dbd64-97a3-411d-aa4c-c3ed66a64f85 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:39:06 localhost journal[207530]: End of file while reading data: Input/output error Feb 23 04:39:06 localhost systemd[1]: libpod-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope: Deactivated successfully. Feb 23 04:39:06 localhost systemd[1]: libpod-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope: Consumed 4.815s CPU time. 
Feb 23 04:39:06 localhost podman[282002]: 2026-02-23 09:39:06.801732229 +0000 UTC m=+4.378312432 container died 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:39:06 localhost systemd[1]: 
var-lib-containers-storage-overlay\x2dcontainers-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b-userdata-shm.mount: Deactivated successfully. Feb 23 04:39:06 localhost podman[282002]: 2026-02-23 09:39:06.872506068 +0000 UTC m=+4.449086281 container cleanup 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:39:06 localhost podman[282002]: nova_compute Feb 23 04:39:06 localhost podman[282160]: 2026-02-23 09:39:06.886616038 +0000 UTC m=+0.078509050 container cleanup 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:39:06 localhost systemd[1]: libpod-conmon-2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b.scope: Deactivated successfully. Feb 23 04:39:06 localhost podman[282188]: error opening file `/run/crun/2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b/status`: No such file or directory Feb 23 04:39:06 localhost podman[282177]: 2026-02-23 09:39:06.988815156 +0000 UTC m=+0.067323181 container cleanup 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', 
'/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 23 04:39:06 localhost podman[282177]: nova_compute Feb 23 04:39:06 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Feb 23 04:39:06 localhost systemd[1]: Stopped nova_compute container. Feb 23 04:39:07 localhost systemd[1]: Starting nova_compute container... Feb 23 04:39:07 localhost systemd[1]: Started libcrun container. Feb 23 04:39:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f24e11ebd9f7a5cf882dc695a3004cbcd963f9ee45c416d30f0f099030bac26/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:07 localhost podman[282192]: 2026-02-23 
09:39:07.148760186 +0000 UTC m=+0.119400696 container init 2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:39:07 localhost podman[282192]: 2026-02-23 09:39:07.157492218 +0000 UTC m=+0.128132728 container start 
2129e353a5d4171f586ce5c762891941b44f7611471f84d84a42a19d9df53b3b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-9420ebc64adcbe31526e40e2b0b78b2e1b7d41ddf41dd4e3243f19db7c97ac09-3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:39:07 localhost podman[282192]: nova_compute Feb 23 04:39:07 localhost nova_compute[282206]: + sudo -E kolla_set_configs Feb 23 04:39:07 localhost systemd[1]: 
Started nova_compute container. Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Validating config file Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying service configuration files Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for 
/etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /etc/ceph Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Creating directory /etc/ceph Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:39:07 localhost 
nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Deleting /usr/sbin/iscsiadm Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Writing out command to execute Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:39:07 localhost nova_compute[282206]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Feb 23 04:39:07 localhost nova_compute[282206]: ++ cat /run_command Feb 23 04:39:07 localhost nova_compute[282206]: + CMD=nova-compute Feb 23 04:39:07 localhost nova_compute[282206]: + ARGS= Feb 23 04:39:07 localhost nova_compute[282206]: + sudo kolla_copy_cacerts Feb 23 04:39:07 localhost nova_compute[282206]: + [[ ! -n '' ]] Feb 23 04:39:07 localhost nova_compute[282206]: + . 
kolla_extend_start Feb 23 04:39:07 localhost nova_compute[282206]: Running command: 'nova-compute' Feb 23 04:39:07 localhost nova_compute[282206]: + echo 'Running command: '\''nova-compute'\''' Feb 23 04:39:07 localhost nova_compute[282206]: + umask 0022 Feb 23 04:39:07 localhost nova_compute[282206]: + exec nova-compute Feb 23 04:39:07 localhost python3.9[282328]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None 
memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Feb 23 04:39:08 localhost systemd[1]: Started libpod-conmon-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e.scope. Feb 23 04:39:08 localhost systemd[1]: Started libcrun container. 
Feb 23 04:39:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Feb 23 04:39:08 localhost podman[282353]: 2026-02-23 09:39:08.227142658 +0000 UTC m=+0.144002243 container init 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, 
org.label-schema.license=GPLv2, container_name=nova_compute_init, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:39:08 localhost podman[282353]: 2026-02-23 09:39:08.238835693 +0000 UTC m=+0.155695278 container start 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.43.0, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:39:08 localhost python3.9[282328]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Applying nova statedir ownership Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Feb 
23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b already 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/console.log Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to 
system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/b81db1e2a8e54083d8c4b030cc59287a706969ae Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-b81db1e2a8e54083d8c4b030cc59287a706969ae Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Feb 
23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9 Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/f23138a46bc477ec40b895db4322b27384fbb01ccd8da7395c9877132dfb82af Feb 23 04:39:08 localhost nova_compute_init[282391]: INFO:nova_statedir:Nova statedir ownership complete Feb 23 04:39:08 localhost systemd[1]: libpod-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e.scope: Deactivated successfully. 
Feb 23 04:39:08 localhost podman[282392]: 2026-02-23 09:39:08.312463711 +0000 UTC m=+0.054979057 container died 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:39:08 localhost podman[282404]: 2026-02-23 09:39:08.391367122 +0000 UTC m=+0.075510207 container cleanup 0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval 
python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '3424c3acd670eb930ca235b189e22cc85d2902412aa65355a03bde52550f3369'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:39:08 localhost systemd[1]: libpod-conmon-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e.scope: Deactivated successfully. Feb 23 04:39:08 localhost systemd[1]: var-lib-containers-storage-overlay-a57f867c5b8445456b5e18d0ba5d2cc6c62032606bf7ac5da73f5ad9f9752ac4-merged.mount: Deactivated successfully. Feb 23 04:39:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0dcfdba09d286eb3abcbc0c0350212b6da0b7ea5d4e3ec4a127be329988c054e-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:39:08 localhost nova_compute[282206]: 2026-02-23 09:39:08.877 282211 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:39:08 localhost nova_compute[282206]: 2026-02-23 09:39:08.878 282211 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:39:08 localhost nova_compute[282206]: 2026-02-23 09:39:08.878 282211 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Feb 23 04:39:08 localhost nova_compute[282206]: 2026-02-23 09:39:08.879 282211 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Feb 23 04:39:08 localhost systemd[1]: session-59.scope: Deactivated successfully. Feb 23 04:39:08 localhost systemd[1]: session-59.scope: Consumed 1min 22.025s CPU time. Feb 23 04:39:08 localhost systemd-logind[759]: Session 59 logged out. Waiting for processes to exit. Feb 23 04:39:08 localhost systemd-logind[759]: Removed session 59. Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.024 282211 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.046 282211 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.046 282211 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. 
execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Feb 23 04:39:09 localhost podman[242954]: time="2026-02-23T09:39:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:39:09 localhost podman[242954]: @ - - [23/Feb/2026:09:39:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1" Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.429 282211 INFO nova.virt.driver [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Feb 23 04:39:09 localhost podman[242954]: @ - - [23/Feb/2026:09:39:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16786 "" "Go-http-client/1.1" Feb 23 04:39:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57424 DF PROTO=TCP SPT=47026 DPT=9102 SEQ=1491910498 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFD4E060000000001030307) Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.541 282211 INFO nova.compute.provider_config [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.549 282211 DEBUG oslo_concurrency.lockutils [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.549 282211 DEBUG oslo_concurrency.lockutils [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.549 282211 DEBUG oslo_concurrency.lockutils [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] command line args: [] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.550 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.551 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_file = ['/etc/nova/nova.conf', 
'/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.552 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console_host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_access_ip_network_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.553 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 
04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.554 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG 
oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] host = np0005626463.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.555 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 
09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.556 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 
282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.557 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.558 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.559 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.560 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] migrate_max_retries = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.561 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_listen_port = 8774 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.562 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 
localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.563 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG 
oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.564 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_host_memory_mb = 512 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.565 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] run_external_periodic_tasks = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.566 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] shelved_offload_time = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.567 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost 
nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.568 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.569 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plugging_timeout = 300 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.570 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.571 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.572 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG 
oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.573 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.574 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service 
[None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.575 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - 
- - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.576 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_flush_on_reconnect = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.577 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.proxies = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.578 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.579 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 
282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.580 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.os_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.581 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.image_type_exclude_list = [] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.582 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.583 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.584 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.keyfile = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.585 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.status_code_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.586 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.587 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.db_retry_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.588 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.pool_timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.589 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_parameters = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.590 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.max_overflow = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.591 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.592 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.api_servers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.593 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.594 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 
282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.595 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.596 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] glance.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.597 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.instances_path_share = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.598 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.599 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.600 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.601 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.connect_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.602 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 
localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.603 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 
DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.604 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.barbican_region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.605 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e 
- - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.606 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.verify_ssl = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.607 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.608 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.609 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost 
nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.610 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.611 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.612 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.613 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.614 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.disk_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.615 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.616 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.inject_partition = -2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.617 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.618 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.619 282211 WARNING oslo_config.cfg [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Feb 23 04:39:09 localhost nova_compute[282206]: live_migration_uri is deprecated for removal in favor of two other options that Feb 23 04:39:09 localhost nova_compute[282206]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Feb 23 04:39:09 localhost nova_compute[282206]: and ``live_migration_inbound_addr`` respectively. Feb 23 04:39:09 localhost nova_compute[282206]: ). Its value may be silently ignored in the future.#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.619 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.620 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.621 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_secret_uuid = f1fea371-cb69-578d-a3d0-b5c472a84b46 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.622 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.623 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.624 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.625 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.625 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.625 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.626 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.626 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.627 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.627 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.627 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.628 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.628 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.628 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.629 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.629 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.uid_maps = [] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.629 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.630 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.631 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.631 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.631 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.632 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.632 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.632 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.633 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.633 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.633 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.634 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.634 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.634 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.635 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.635 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.635 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.636 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.636 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.636 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.637 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.637 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.637 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.638 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.638 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 
09:39:09.638 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.639 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.639 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.639 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.640 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.640 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.640 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.641 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.642 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.642 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.642 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.643 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - 
- - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.643 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.644 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.645 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.645 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - 
-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.645 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.646 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.646 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.647 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.647 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.647 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.648 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.connect_retry_delay = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.648 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.648 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.649 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.649 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.649 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.650 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.650 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.keyfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.650 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.651 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.651 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.651 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.652 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.652 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.652 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.project_name = service log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.653 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.653 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.653 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.654 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.654 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.654 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.655 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.system_scope = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.655 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.655 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.656 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.657 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.657 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.657 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.658 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.658 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.658 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.659 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.659 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 
2026-02-23 09:39:09.659 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.660 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.660 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.660 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.661 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.661 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.662 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.662 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e 
- - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.663 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.663 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.663 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.664 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.664 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.664 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.665 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.665 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.665 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.666 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.666 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.666 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.667 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 
localhost nova_compute[282206]: 2026-02-23 09:39:09.667 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.667 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.668 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.668 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.668 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.669 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.669 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.669 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.670 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.670 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.670 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.671 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.671 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 
23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.672 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.672 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.672 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.673 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.673 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.673 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.674 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.674 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.674 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.675 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.675 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.675 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.676 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.677 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.678 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.send_service_user_token = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.679 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.html5proxy_port = 6082 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.680 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.681 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] spice.zlib_compression = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.682 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.683 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.684 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vendordata_dynamic_auth.timeout = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.685 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.686 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.687 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost 
nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.688 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 
09:39:09.689 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.690 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.691 
282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.691 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.692 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.693 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost 
nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.694 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.695 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.696 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.697 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.698 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.699 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.700 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.701 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 
localhost nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.702 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 
04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.703 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.704 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.705 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.706 
282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.706 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.707 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.708 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.driver = ['noop'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.709 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.710 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.711 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.712 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 
2026-02-23 09:39:09.713 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.714 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG 
oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.715 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None 
req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.716 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost 
nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.717 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.718 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.719 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.720 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.logger_name = 
os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.721 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] 
nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.722 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.722 282211 DEBUG oslo_service.service [None req-0600316d-e65c-4aa8-a9ef-943515b8002e - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.722 282211 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.749 282211 INFO nova.virt.node [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Determined node identity be63d86c-a403-4ec9-a515-07ea2962cb4d from /var/lib/nova/compute_id#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.750 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.751 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.752 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Starting connection event dispatch thread initialize 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.752 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.763 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.766 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.768 282211 INFO nova.virt.libvirt.driver [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Connection event '1' reason 'None'#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.771 282211 INFO nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML dumped here across many journald continuation lines; the XML tags were stripped by log capture. Recoverable element values, in order: host UUID bdcaa433-cfc7-450a-99ab-f0985ab59447; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; migration transports tcp and rdma; memory/page counts 16116612, 4029153, 0, 0; secmodel selinux (doi 0, baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0); secmodel dac (doi 0, labels +107:+107); guest arch i686 (hvm, wordsize 32, emulator /usr/libexec/qemu-kvm) and guest arch x86_64 (hvm, wordsize 64, emulator /usr/libexec/qemu-kvm), each listing machine types pc-i440fx-rhel7.6.0 (canonical pc), pc-q35-rhel9.8.0 (canonical q35), pc-q35-rhel9.6.0, pc-q35-rhel8.6.0, pc-q35-rhel9.4.0, pc-q35-rhel8.5.0, pc-q35-rhel8.3.0, pc-q35-rhel7.6.0, pc-q35-rhel8.4.0, pc-q35-rhel9.2.0, pc-q35-rhel8.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.0.0, pc-q35-rhel8.1.0]#033[00m
Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.783 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.786 282211 DEBUG nova.virt.libvirt.volume.mount [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Feb 23 04:39:09 localhost nova_compute[282206]: 2026-02-23 09:39:09.788 282211 DEBUG nova.virt.libvirt.host [None req-a3e77674-ebee-4da0-8221-dce4e3ab889b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
[domain capabilities XML dump begins, again with tags stripped; recoverable opening values: path /usr/libexec/qemu-kvm, domain kvm, machine pc-i440fx-rhel7.6.0, arch i686]
[remainder of the stripped domain-capabilities XML; recoverable element values, in order: firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash; boolean/enum values yes, no, no, on, off, on, off (attribute names lost in capture); CPU mode host-model EPYC-Rome, vendor AMD; then the supported custom CPU model list: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, ClearwaterForest, ClearwaterForest-v1, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-Genoa-v2, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Milan-v3 … (log chunk truncated mid-dump)]
23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Rome Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Rome-v1 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Rome-v2 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Rome-v3 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Rome-v4 Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Rome-v5 Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Turin Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost 
nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-Turin-v1 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost 
nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-v1 Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-v2 Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-v3 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-v4 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: EPYC-v5 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: GraniteRapids Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 
23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: GraniteRapids-v1 Feb 23 04:39:09 
localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: 
Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: GraniteRapids-v2 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 
04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: GraniteRapids-v3 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost 
nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-IBRS Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-noTSX Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-noTSX-IBRS Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-v1 Feb 23 04:39:09 localhost 
nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-v2 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-v3 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Haswell-v4 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Icelake-Server Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 
04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Icelake-Server-noTSX Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Icelake-Server-v1 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: 
Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Icelake-Server-v2 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Icelake-Server-v3 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost 
nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Icelake-Server-v4 Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost nova_compute[282206]: Feb 23 04:39:09 localhost 
Feb 23 04:39:09 localhost nova_compute[282206]: [multi-line libvirt supported-CPU-model dump; empty continuation entries collapsed] Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SapphireRapids-v4 SierraForest SierraForest-v1
Feb 23 04:41:03 localhost nova_compute[282206]: 2026-02-23 09:41:03.107 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:03 localhost rsyslogd[758]: imjournal: 8899 messages lost due to rate-limiting (20000 allowed within 600 seconds) Feb 23 04:41:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:41:06 localhost systemd[1]: tmp-crun.aLEaEc.mount: Deactivated successfully. Feb 23 04:41:06 localhost podman[285362]: 2026-02-23 09:41:06.903921966 +0000 UTC m=+0.080223461 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products.
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, version=9.7, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc.) Feb 23 04:41:06 localhost podman[285362]: 2026-02-23 09:41:06.94531229 +0000 UTC m=+0.121613725 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, managed_by=edpm_ansible) Feb 23 04:41:06 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:41:07 localhost nova_compute[282206]: 2026-02-23 09:41:07.540 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:08 localhost nova_compute[282206]: 2026-02-23 09:41:08.110 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:09 localhost podman[242954]: time="2026-02-23T09:41:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:41:09 localhost podman[242954]: @ - - [23/Feb/2026:09:41:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1" Feb 23 04:41:09 localhost podman[242954]: @ - - [23/Feb/2026:09:41:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16796 "" "Go-http-client/1.1" Feb 23 04:41:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8690 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF28060000000001030307) Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.258 282211 DEBUG oslo_service.periodic_task [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.259 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.285 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.285 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.285 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:41:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:41:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.587 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:12 localhost podman[285401]: 2026-02-23 09:41:12.610280432 +0000 UTC m=+0.125086834 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.612 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock 
"refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.613 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.613 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.613 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:41:12 localhost podman[285401]: 2026-02-23 09:41:12.622250863 +0000 UTC m=+0.137057275 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', 
'--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:41:12 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:41:12 localhost podman[285399]: 2026-02-23 09:41:12.683341009 +0000 UTC m=+0.201224276 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:41:12 localhost podman[285399]: 2026-02-23 09:41:12.716086795 +0000 UTC m=+0.233970092 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0) Feb 23 04:41:12 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.961 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.989 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:41:12 localhost 
nova_compute[282206]: 2026-02-23 09:41:12.989 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.990 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.990 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost 
nova_compute[282206]: 2026-02-23 09:41:12.991 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.992 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:41:12 localhost nova_compute[282206]: 2026-02-23 09:41:12.992 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.009 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.010 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.010 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.011 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.011 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.111 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:13 localhost openstack_network_exporter[245358]: ERROR 09:41:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:41:13 localhost openstack_network_exporter[245358]: Feb 23 04:41:13 localhost openstack_network_exporter[245358]: ERROR 09:41:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:41:13 localhost openstack_network_exporter[245358]: Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.472 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.538 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping 
disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.539 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.778 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.781 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12330MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.781 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.782 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.881 282211 DEBUG 
nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.881 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.882 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:41:13 localhost nova_compute[282206]: 2026-02-23 09:41:13.937 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:41:14 localhost nova_compute[282206]: 2026-02-23 09:41:14.396 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:41:14 localhost nova_compute[282206]: 2026-02-23 09:41:14.404 282211 DEBUG nova.compute.provider_tree [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:41:14 localhost nova_compute[282206]: 2026-02-23 09:41:14.426 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:41:14 localhost nova_compute[282206]: 2026-02-23 09:41:14.458 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:41:14 localhost nova_compute[282206]: 2026-02-23 09:41:14.459 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.677s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:41:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:41:15 localhost podman[285558]: 2026-02-23 09:41:15.908650516 +0000 UTC m=+0.082119479 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:41:15 localhost podman[285558]: 2026-02-23 09:41:15.918533062 +0000 UTC m=+0.092002055 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 
9 Base Image, io.buildah.version=1.43.0) Feb 23 04:41:15 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:41:17 localhost nova_compute[282206]: 2026-02-23 09:41:17.626 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:18 localhost nova_compute[282206]: 2026-02-23 09:41:18.115 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:41:21 localhost podman[285578]: 2026-02-23 09:41:21.905425167 +0000 UTC m=+0.081640116 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216) Feb 23 04:41:21 localhost snmpd[67690]: empty variable list in _query Feb 23 04:41:21 localhost snmpd[67690]: empty variable list in _query Feb 23 04:41:21 localhost podman[285578]: 2026-02-23 09:41:21.914309232 +0000 UTC m=+0.090524171 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Feb 23 04:41:21 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. 
Feb 23 04:41:22 localhost nova_compute[282206]: 2026-02-23 09:41:22.651 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:23 localhost nova_compute[282206]: 2026-02-23 09:41:23.117 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50868 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF5BFF0000000001030307) Feb 23 04:41:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50869 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF60060000000001030307) Feb 23 04:41:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8691 DF PROTO=TCP SPT=43934 DPT=9102 SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF68060000000001030307) Feb 23 04:41:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50870 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF68060000000001030307) Feb 23 04:41:27 localhost nova_compute[282206]: 2026-02-23 09:41:27.693 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:28 localhost 
nova_compute[282206]: 2026-02-23 09:41:28.118 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:28 localhost sshd[285597]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:41:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27323 DF PROTO=TCP SPT=36860 DPT=9102 SEQ=3021588974 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF6C060000000001030307) Feb 23 04:41:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:41:29 localhost podman[285599]: 2026-02-23 09:41:29.91161492 +0000 UTC m=+0.086209217 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:41:29 localhost podman[285599]: 2026-02-23 09:41:29.928239975 +0000 UTC m=+0.102834262 container exec_died 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:41:29 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:41:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50871 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF77C60000000001030307) Feb 23 04:41:32 localhost nova_compute[282206]: 2026-02-23 09:41:32.731 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:33 localhost nova_compute[282206]: 2026-02-23 09:41:33.121 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:37 localhost nova_compute[282206]: 2026-02-23 09:41:37.756 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:41:37 localhost podman[285622]: 2026-02-23 09:41:37.901147904 +0000 UTC m=+0.077466665 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 23 04:41:37 localhost podman[285622]: 2026-02-23 09:41:37.91615287 +0000 UTC m=+0.092471641 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, release=1770267347) Feb 23 04:41:37 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:41:38 localhost nova_compute[282206]: 2026-02-23 09:41:38.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:38 localhost sshd[285643]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:41:39 localhost podman[242954]: time="2026-02-23T09:41:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:41:39 localhost podman[242954]: @ - - [23/Feb/2026:09:41:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1" Feb 23 04:41:39 localhost podman[242954]: @ - - [23/Feb/2026:09:41:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16795 "" "Go-http-client/1.1" Feb 23 04:41:39 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50872 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFF98070000000001030307) Feb 23 04:41:42 localhost nova_compute[282206]: 2026-02-23 09:41:42.790 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:41:42 localhost podman[285645]: 2026-02-23 09:41:42.910761288 +0000 UTC m=+0.080973034 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:41:42 localhost podman[285646]: 2026-02-23 09:41:42.961553084 +0000 UTC m=+0.127934041 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:41:42 localhost podman[285645]: 2026-02-23 09:41:42.974289719 +0000 UTC m=+0.144501445 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 23 04:41:42 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:41:43 localhost podman[285646]: 2026-02-23 09:41:43.025107357 +0000 UTC m=+0.191488304 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:41:43 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:41:43 localhost nova_compute[282206]: 2026-02-23 09:41:43.124 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:43 localhost openstack_network_exporter[245358]: ERROR 09:41:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:41:43 localhost openstack_network_exporter[245358]: Feb 23 04:41:43 localhost openstack_network_exporter[245358]: ERROR 09:41:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:41:43 localhost openstack_network_exporter[245358]: Feb 23 04:41:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:41:46 localhost podman[285693]: 2026-02-23 09:41:46.903719288 +0000 UTC m=+0.080052095 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0) Feb 23 04:41:46 localhost podman[285693]: 2026-02-23 09:41:46.943254145 +0000 UTC m=+0.119587002 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ceilometer_agent_compute, io.buildah.version=1.43.0) Feb 23 04:41:46 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:41:47 localhost nova_compute[282206]: 2026-02-23 09:41:47.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:48 localhost nova_compute[282206]: 2026-02-23 09:41:48.126 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:41:48.544 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:41:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:41:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:41:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:41:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:41:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:41:52 localhost nova_compute[282206]: 2026-02-23 09:41:52.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:52 localhost podman[285712]: 2026-02-23 09:41:52.910041253 +0000 UTC m=+0.085008819 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent) Feb 23 04:41:52 localhost podman[285712]: 2026-02-23 09:41:52.94376542 +0000 UTC m=+0.118732976 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:41:52 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:41:53 localhost nova_compute[282206]: 2026-02-23 09:41:53.129 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:53 localhost sshd[285730]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:41:53 localhost systemd-logind[759]: New session 61 of user zuul. Feb 23 04:41:53 localhost systemd[1]: Started Session 61 of User zuul. Feb 23 04:41:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16173 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFD1310000000001030307) Feb 23 04:41:54 localhost python3[285752]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:41:54 localhost subscription-manager[285753]: Unregistered machine with identity: 71d8a449-76d3-4525-90bb-1ec088bb454f Feb 23 04:41:54 localhost systemd-journald[47710]: Field hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation. Feb 23 04:41:54 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating. Feb 23 04:41:54 localhost rsyslogd[758]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:41:54 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:41:54 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Feb 23 04:41:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16174 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFD5470000000001030307) Feb 23 04:41:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50873 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFD8060000000001030307) Feb 23 04:41:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16175 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFDD470000000001030307) Feb 23 04:41:57 localhost nova_compute[282206]: 2026-02-23 09:41:57.840 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:58 localhost nova_compute[282206]: 2026-02-23 09:41:58.130 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:41:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8692 DF PROTO=TCP SPT=43934 DPT=9102 
SEQ=1319889552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFE6060000000001030307) Feb 23 04:42:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:42:00 localhost podman[285756]: 2026-02-23 09:42:00.909545969 +0000 UTC m=+0.083077650 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:42:00 localhost podman[285756]: 2026-02-23 09:42:00.917554398 +0000 UTC m=+0.091086109 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:42:00 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:42:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16176 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3BFFED070000000001030307) Feb 23 04:42:02 localhost nova_compute[282206]: 2026-02-23 09:42:02.877 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:03 localhost nova_compute[282206]: 2026-02-23 09:42:03.131 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:03 localhost sshd[285779]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:42:07 localhost nova_compute[282206]: 2026-02-23 09:42:07.926 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:08 localhost nova_compute[282206]: 2026-02-23 09:42:08.132 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:42:08 localhost podman[285781]: 2026-02-23 09:42:08.900938167 +0000 UTC m=+0.076790338 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, distribution-scope=public, version=9.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git) Feb 23 04:42:08 localhost podman[285781]: 2026-02-23 09:42:08.917294718 +0000 UTC m=+0.093146889 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., vcs-type=git, release=1770267347, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7) Feb 23 04:42:08 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:42:09 localhost podman[242954]: time="2026-02-23T09:42:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:42:09 localhost podman[242954]: @ - - [23/Feb/2026:09:42:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1" Feb 23 04:42:09 localhost podman[242954]: @ - - [23/Feb/2026:09:42:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16796 "" "Go-http-client/1.1" Feb 23 04:42:09 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16177 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C000E060000000001030307) Feb 23 04:42:12 localhost nova_compute[282206]: 2026-02-23 09:42:12.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:13 localhost nova_compute[282206]: 2026-02-23 09:42:13.134 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:13 localhost openstack_network_exporter[245358]: ERROR 09:42:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:42:13 localhost openstack_network_exporter[245358]: Feb 23 04:42:13 localhost 
openstack_network_exporter[245358]: ERROR 09:42:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:42:13 localhost openstack_network_exporter[245358]: Feb 23 04:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:42:13 localhost podman[285802]: 2026-02-23 09:42:13.904213193 +0000 UTC m=+0.077735597 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:42:13 localhost podman[285803]: 2026-02-23 09:42:13.981986571 +0000 UTC m=+0.151035036 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:42:13 localhost podman[285802]: 2026-02-23 09:42:13.989378462 +0000 UTC m=+0.162900886 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:42:14 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:42:14 localhost podman[285803]: 2026-02-23 09:42:14.042530272 +0000 UTC m=+0.211578757 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:42:14 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.461 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.462 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.462 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.462 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.635 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.636 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.636 282211 DEBUG 
nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:42:14 localhost nova_compute[282206]: 2026-02-23 09:42:14.636 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.015 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.059 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.060 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.061 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.061 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.061 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.062 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.062 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.062 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.063 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.063 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.099 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.100 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.100 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.101 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.101 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.636 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.535s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.702 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 
04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.703 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.916 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.918 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12328MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.919 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:42:15 localhost nova_compute[282206]: 2026-02-23 09:42:15.919 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.017 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.017 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.018 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.051 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.507 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.513 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.535 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.537 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:42:16 localhost nova_compute[282206]: 2026-02-23 09:42:16.538 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:42:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:42:17 localhost podman[285982]: 2026-02-23 09:42:17.528275176 +0000 UTC m=+0.077429519 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216) Feb 23 04:42:17 localhost podman[285982]: 2026-02-23 09:42:17.539836777 +0000 UTC m=+0.088991170 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2) Feb 23 04:42:17 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:42:17 localhost nova_compute[282206]: 2026-02-23 09:42:17.965 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:18 localhost nova_compute[282206]: 2026-02-23 09:42:18.135 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:23 localhost nova_compute[282206]: 2026-02-23 09:42:23.011 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:23 localhost nova_compute[282206]: 2026-02-23 09:42:23.137 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:42:23 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Feb 23 04:42:23 localhost podman[286002]: 2026-02-23 09:42:23.360605442 +0000 UTC m=+0.094314416 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:42:23 localhost 
podman[286002]: 2026-02-23 09:42:23.365302009 +0000 UTC m=+0.099011023 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:42:23 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:42:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10004 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3427797769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C0046600000000001030307) Feb 23 04:42:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10005 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3427797769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C004A860000000001030307) Feb 23 04:42:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16178 DF PROTO=TCP SPT=43118 DPT=9102 SEQ=3047473331 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C004E060000000001030307) Feb 23 04:42:26 localhost sshd[286074]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:42:26 localhost systemd-logind[759]: New session 62 of user tripleo-admin. Feb 23 04:42:26 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 23 04:42:26 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 23 04:42:26 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 23 04:42:26 localhost systemd[1]: Starting User Manager for UID 1003... Feb 23 04:42:26 localhost systemd[286078]: Queued start job for default target Main User Target. Feb 23 04:42:26 localhost systemd[286078]: Created slice User Application Slice. Feb 23 04:42:26 localhost systemd[286078]: Started Mark boot as successful after the user session has run 2 minutes. Feb 23 04:42:26 localhost systemd[286078]: Started Daily Cleanup of User's Temporary Directories. 
Feb 23 04:42:26 localhost systemd[286078]: Reached target Paths. Feb 23 04:42:26 localhost systemd[286078]: Reached target Timers. Feb 23 04:42:26 localhost systemd[286078]: Starting D-Bus User Message Bus Socket... Feb 23 04:42:26 localhost systemd[286078]: Starting Create User's Volatile Files and Directories... Feb 23 04:42:26 localhost systemd[286078]: Listening on D-Bus User Message Bus Socket. Feb 23 04:42:26 localhost systemd[286078]: Reached target Sockets. Feb 23 04:42:26 localhost systemd[286078]: Finished Create User's Volatile Files and Directories. Feb 23 04:42:26 localhost systemd[286078]: Reached target Basic System. Feb 23 04:42:26 localhost systemd[286078]: Reached target Main User Target. Feb 23 04:42:26 localhost systemd[286078]: Startup finished in 150ms. Feb 23 04:42:26 localhost systemd[1]: Started User Manager for UID 1003. Feb 23 04:42:26 localhost systemd[1]: Started Session 62 of User tripleo-admin. Feb 23 04:42:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10006 DF PROTO=TCP SPT=38402 DPT=9102 SEQ=3427797769 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C0052860000000001030307) Feb 23 04:42:27 localhost python3[286222]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 
} ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:42:28 localhost sshd[286278]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:42:28 localhost nova_compute[282206]: 2026-02-23 09:42:28.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:8b:ea:9c MACDST=fa:16:3e:f0:80:57 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50874 DF PROTO=TCP SPT=41664 DPT=9102 SEQ=200366359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A3C0056070000000001030307) Feb 23 04:42:28 localhost nova_compute[282206]: 2026-02-23 09:42:28.138 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:28 localhost python3[286368]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Feb 23 04:42:28 localhost systemd[1]: Stopping Netfilter Tables... Feb 23 04:42:28 localhost systemd[1]: nftables.service: Deactivated successfully. Feb 23 04:42:28 localhost systemd[1]: Stopped Netfilter Tables. Feb 23 04:42:28 localhost systemd[1]: Starting Netfilter Tables... Feb 23 04:42:28 localhost systemd[1]: Finished Netfilter Tables. Feb 23 04:42:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:42:31 localhost systemd[1]: tmp-crun.2S977Q.mount: Deactivated successfully. Feb 23 04:42:31 localhost podman[286393]: 2026-02-23 09:42:31.923484349 +0000 UTC m=+0.095956217 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:42:31 localhost podman[286393]: 2026-02-23 
09:42:31.959290446 +0000 UTC m=+0.131762304 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:42:31 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:42:33 localhost nova_compute[282206]: 2026-02-23 09:42:33.087 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:33 localhost nova_compute[282206]: 2026-02-23 09:42:33.139 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:36 localhost sshd[286435]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:42:38 localhost nova_compute[282206]: 2026-02-23 09:42:38.128 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:38 localhost nova_compute[282206]: 2026-02-23 09:42:38.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:42:39 localhost podman[286491]: 2026-02-23 09:42:39.352442018 +0000 UTC m=+0.081263298 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z) Feb 23 04:42:39 localhost podman[242954]: time="2026-02-23T09:42:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:42:39 localhost podman[242954]: @ - - [23/Feb/2026:09:42:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148535 "" "Go-http-client/1.1" Feb 23 04:42:39 localhost podman[286491]: 2026-02-23 09:42:39.447179166 +0000 UTC m=+0.176000436 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vendor=Red Hat, Inc., distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=) Feb 23 04:42:39 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:42:39 localhost podman[242954]: @ - - [23/Feb/2026:09:42:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16804 "" "Go-http-client/1.1" Feb 23 04:42:43 localhost nova_compute[282206]: 2026-02-23 09:42:43.142 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:43 localhost nova_compute[282206]: 2026-02-23 09:42:43.143 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:43 localhost nova_compute[282206]: 2026-02-23 09:42:43.144 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:42:43 localhost nova_compute[282206]: 2026-02-23 09:42:43.144 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:43 localhost nova_compute[282206]: 2026-02-23 09:42:43.157 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:43 localhost nova_compute[282206]: 2026-02-23 09:42:43.158 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:43 localhost openstack_network_exporter[245358]: ERROR 09:42:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:42:43 localhost 
openstack_network_exporter[245358]: Feb 23 04:42:43 localhost openstack_network_exporter[245358]: ERROR 09:42:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:42:43 localhost openstack_network_exporter[245358]: Feb 23 04:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:42:44 localhost systemd[1]: tmp-crun.rmFztn.mount: Deactivated successfully. Feb 23 04:42:44 localhost podman[286548]: 2026-02-23 09:42:44.905634543 +0000 UTC m=+0.076230051 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:42:44 localhost podman[286547]: 2026-02-23 09:42:44.878732583 +0000 UTC m=+0.056661690 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:42:44 localhost podman[286548]: 2026-02-23 09:42:44.940170881 +0000 UTC m=+0.110766369 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:42:44 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:42:44 localhost podman[286547]: 2026-02-23 09:42:44.962327293 +0000 UTC m=+0.140256410 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible) Feb 23 04:42:44 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:42:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:42:47 localhost podman[286600]: 2026-02-23 09:42:47.923014906 +0000 UTC m=+0.093448629 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:42:47 localhost podman[286600]: 2026-02-23 09:42:47.962274572 +0000 UTC m=+0.132708305 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0) Feb 23 04:42:47 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:42:48 localhost nova_compute[282206]: 2026-02-23 09:42:48.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:48 localhost nova_compute[282206]: 2026-02-23 09:42:48.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:48 localhost nova_compute[282206]: 2026-02-23 09:42:48.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:42:48 localhost nova_compute[282206]: 2026-02-23 09:42:48.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:48 localhost nova_compute[282206]: 2026-02-23 09:42:48.194 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:48 localhost nova_compute[282206]: 2026-02-23 09:42:48.195 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:48 localhost podman[286692]: Feb 23 04:42:48 localhost podman[286692]: 2026-02-23 09:42:48.51546771 +0000 UTC m=+0.072913577 container create 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, architecture=x86_64, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=) Feb 23 04:42:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:42:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:42:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:42:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:42:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:42:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:42:48 localhost systemd[1]: Started libpod-conmon-4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7.scope. Feb 23 04:42:48 localhost podman[286692]: 2026-02-23 09:42:48.484994039 +0000 UTC m=+0.042439936 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:42:48 localhost systemd[1]: Started libcrun container. Feb 23 04:42:48 localhost podman[286692]: 2026-02-23 09:42:48.601497836 +0000 UTC m=+0.158943703 container init 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, name=rhceph, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, release=1770267347, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public) Feb 23 04:42:48 localhost podman[286692]: 2026-02-23 09:42:48.610460096 +0000 UTC m=+0.167905973 container start 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, io.buildah.version=1.42.2, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, architecture=x86_64, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph) Feb 23 04:42:48 localhost podman[286692]: 2026-02-23 09:42:48.610734064 +0000 UTC m=+0.168179971 container attach 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, architecture=x86_64, ceph=True, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph) Feb 23 04:42:48 localhost elegant_spence[286707]: 167 167 Feb 23 04:42:48 localhost systemd[1]: libpod-4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7.scope: Deactivated successfully. Feb 23 04:42:48 localhost podman[286692]: 2026-02-23 09:42:48.615854484 +0000 UTC m=+0.173300351 container died 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph 
Storage 7 on RHEL 9, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Feb 23 04:42:48 localhost podman[286712]: 2026-02-23 09:42:48.707851616 +0000 UTC m=+0.080449642 container remove 4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_spence, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, release=1770267347, io.openshift.expose-services=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=) Feb 23 04:42:48 localhost systemd[1]: libpod-conmon-4ef5448866c00f05962e8baf2b844f36608497aac14600eb9a7a994d7ca525d7.scope: Deactivated successfully. Feb 23 04:42:48 localhost systemd[1]: Reloading. Feb 23 04:42:48 localhost systemd-sysv-generator[286758]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:42:48 localhost systemd-rc-local-generator[286753]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: var-lib-containers-storage-overlay-90b778e891eff2f457563f8bd9e27a554e43b9b8578aabf5f635bd483d3c32b5-merged.mount: Deactivated successfully. Feb 23 04:42:49 localhost systemd[1]: Reloading. Feb 23 04:42:49 localhost systemd-rc-local-generator[286794]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:42:49 localhost systemd-sysv-generator[286799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:42:49 localhost systemd[1]: Starting Ceph mds.mds.np0005626463.qcthuc for f1fea371-cb69-578d-a3d0-b5c472a84b46... 
Feb 23 04:42:49 localhost podman[286858]: Feb 23 04:42:49 localhost podman[286858]: 2026-02-23 09:42:49.814592385 +0000 UTC m=+0.063851764 container create 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , distribution-scope=public, GIT_BRANCH=main, RELEASE=main, release=1770267347, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:42:49 localhost systemd[1]: tmp-crun.jiUAtI.mount: Deactivated successfully. 
Feb 23 04:42:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 04:42:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 23 04:42:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 04:42:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3db296b006fd64b095e1b104925bc0aecc9c34c3dc9a003a02759c089e28f2c/merged/var/lib/ceph/mds/ceph-mds.np0005626463.qcthuc supports timestamps until 2038 (0x7fffffff) Feb 23 04:42:49 localhost podman[286858]: 2026-02-23 09:42:49.880009307 +0000 UTC m=+0.129268686 container init 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:42:49 localhost podman[286858]: 2026-02-23 09:42:49.783375731 +0000 UTC m=+0.032635160 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:42:49 localhost podman[286858]: 2026-02-23 09:42:49.890474504 +0000 UTC m=+0.139733883 container start 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red 
Hat Ceph Storage 7 on RHEL 9) Feb 23 04:42:49 localhost bash[286858]: 35c397f376b989389f5487b314924f02dd848f945c70656bc276f291652231c2 Feb 23 04:42:49 localhost systemd[1]: Started Ceph mds.mds.np0005626463.qcthuc for f1fea371-cb69-578d-a3d0-b5c472a84b46. Feb 23 04:42:49 localhost ceph-mds[286877]: set uid:gid to 167:167 (ceph:ceph) Feb 23 04:42:49 localhost ceph-mds[286877]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mds, pid 2 Feb 23 04:42:49 localhost ceph-mds[286877]: main not setting numa affinity Feb 23 04:42:49 localhost ceph-mds[286877]: pidfile_write: ignore empty --pid-file Feb 23 04:42:49 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc[286873]: starting mds.mds.np0005626463.qcthuc at Feb 23 04:42:49 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 9 from mon.1 Feb 23 04:42:50 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 10 from mon.1 Feb 23 04:42:50 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Monitors have assigned me to become a standby. 
Feb 23 04:42:52 localhost podman[287077]: 2026-02-23 09:42:52.054568592 +0000 UTC m=+0.088890717 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux , version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.42.2, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vendor=Red Hat, Inc.) 
Feb 23 04:42:52 localhost podman[287077]: 2026-02-23 09:42:52.162322585 +0000 UTC m=+0.196644740 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 04:42:53 localhost nova_compute[282206]: 2026-02-23 09:42:53.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:53 localhost nova_compute[282206]: 2026-02-23 09:42:53.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:53 localhost nova_compute[282206]: 2026-02-23 09:42:53.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:42:53 localhost nova_compute[282206]: 2026-02-23 09:42:53.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:53 localhost nova_compute[282206]: 2026-02-23 09:42:53.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:42:53 localhost nova_compute[282206]: 2026-02-23 09:42:53.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:42:53 localhost podman[287197]: 2026-02-23 09:42:53.585378568 +0000 UTC m=+0.092301572 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:42:53 localhost 
podman[287197]: 2026-02-23 09:42:53.615831979 +0000 UTC m=+0.122754973 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:42:53 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:42:54 localhost systemd[1]: session-61.scope: Deactivated successfully.
Feb 23 04:42:54 localhost systemd-logind[759]: Session 61 logged out. Waiting for processes to exit.
Feb 23 04:42:54 localhost systemd-logind[759]: Removed session 61.
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.135 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.142 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4aaf9f2e-74b2-4fb6-a1a5-a9a6d6f13a3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.136299', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081069de-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '5004d5749f8398215a45450249584f5b11d2cf334cfce85b07d22aa52eeeb41e'}]}, 'timestamp': '2026-02-23 09:42:56.143503', '_unique_id': '0a039fb2933c4daab25802401af3fada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.144 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.146 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.178 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.179 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c8713182-ca09-4b43-ac43-ff2572069154', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.146526', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0815dd74-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'f8da7f2f74754f91027cbe6d8d9a4b7d11b4755f261fad52ba81689d46fa29af'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.146526', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0815eee0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '9e762d2506229680f6babf1359f5a19f5e41fd7db9e2f039fa245e68d359f464'}]}, 'timestamp': '2026-02-23 09:42:56.179582', '_unique_id': 'a0b0444ba2a74038a8c55bd33ecd1194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.180 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e606598-91e4-4330-b974-a327d77ee553', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.182078', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '08166104-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '22c44ac3b6758376916dee428e35ffb14e91fcbf84e3b23578b550e9ccd625d4'}]}, 'timestamp': '2026-02-23 09:42:56.182531', '_unique_id': '905ca80190324939a3a2f41f350c54bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.183 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '996d5aae-ff70-4878-88bf-b81e7f64d9d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.184640', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0816c784-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '85ec92f7e1ac08582c6603c4165e9c0a97fbc4c6dccb2082acc0146d109248f7'}]}, 'timestamp': '2026-02-23 09:42:56.185159', '_unique_id': 'dded05689609423497f47511a26a6736'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.186 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.187 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 9700000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d4e83ee-ca4b-4695-8951-0634518fe67c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9700000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:42:56.187252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '081a3eaa-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.396759027, 'message_signature': '84c1b1f10e26d9c2d7f324d48d3955e3f5d92dba676e3c76fdbd8f59efbee1fa'}]}, 'timestamp': '2026-02-23 09:42:56.207857', '_unique_id': 'c8e28c1c2e8b4ccf95ea6ab5e1aa2706'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.208 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24981149-fc73-432d-b042-6bb0a0d3e4ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.210026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1,
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081c6716-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '1818b50183424b042bbc0399f0698642d97c77c12f37ce3824cbe391e85e3729'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.210026', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081c795e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '8c96c267f3cfa2fbb40bc788cce16a437a1eedbf42b8de22cab9efdf63493183'}]}, 'timestamp': '2026-02-23 09:42:56.222446', '_unique_id': '0ce8cdaeacb047fda1e9f250543a2f76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.223 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.224 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.outgoing.packets in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45f95f31-45cf-45d1-82be-68773a8b9b6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.224911', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 
'081cebdc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '3df5b2a51dee9eac3b75e48feddb3ccc1d82570e14b93a9c026a60ed9b14a868'}]}, 'timestamp': '2026-02-23 09:42:56.225411', '_unique_id': 'eab912c8af8948e498505754776f9906'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b17ad395-4a94-4f80-9692-174ef6dd8139', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.227643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081d5662-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '649054c31e892406dcd07576fd8662fb6735d82ab8e8f58df3db825d22a333dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.227643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081d66a2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'fa9b248c7baaf607629b0d15a9ffafd526ad18c313951f52e1f12efb2817b370'}]}, 'timestamp': '2026-02-23 09:42:56.228517', '_unique_id': 'd33283c0969f46b198a466d95849daae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dfdc84b7-e5fb-42e3-a010-403818eaf69f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:42:56.230659', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '081dcc0a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.396759027, 'message_signature': '75243a3bb81d2850059da707eea61dd7406b03c34ac9a7b760f00c41027de211'}]}, 'timestamp': '2026-02-23 09:42:56.231128', '_unique_id': '5eccb86279004ea6b41fe1d4ebd3d321'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6564a07a-c4ee-418d-9fec-7a2da5189e32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.233188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081e2fd8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'a409e70945339c934059cc9b4e4ab685a03e4d2d4ced2913f07c1e1a8d1a17a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.233188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081e40f4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '1aa00b07a24c31ea886ebca43bfed9976d7a5e58f3185f6d0a348c3f514910ce'}]}, 'timestamp': '2026-02-23 09:42:56.234110', '_unique_id': 'bef1a1d958944020aee2777643b28a81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.236 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3105c122-33fb-4c3c-b6fa-45a76a662d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.236223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081ea3dc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'f16c11c1bcf09df6d13569d36d5cfeef06740c708a2348b39be2f547c186524a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.236223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081eb39a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'e47025324de5ea8066e8ed106bcb86526eba788bacc773e7bef958594f9b63db'}]}, 'timestamp': '2026-02-23 09:42:56.237076', '_unique_id': 'f852bd1bef3340b8aa8c0fde8d3e99d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Feb 
23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 
04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35b31cfd-049d-48c6-a268-e95fcf7f7366', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.239207', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081f18a8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '94b95e523751144288d6dfb24687e792d154330471d92ff1c81989d6dbaea508'}]}, 'timestamp': '2026-02-23 09:42:56.239656', '_unique_id': '9d8da660cbd74c5dbc2763325accba47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2974971b-ae44-4118-bd0e-a4fb2cb025c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.241744', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '081f7d16-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '837af4ea1562a7e1fbb3ce76ec9a0c79a6556e9a8847261605f6c3cd7cf097c4'}]}, 'timestamp': '2026-02-23 09:42:56.242225', '_unique_id': 'e4998c743ce84d5bb1b22153e6ea1ea2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '21d9aa53-0676-4c64-82e6-11d4b14804c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.244410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '081fe36e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '6de8348f7f9f15e59e7f8127929fddf2e29c6b0c821a7a2877c02664f5ccb9a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.244410', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '081ff53e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'd8cbab1a69c8b5e42d30545f7fc0c524eaa12f3015be398a30cae9958de8361b'}]}, 'timestamp': '2026-02-23 09:42:56.245276', '_unique_id': '364720da7ce44f13a2269ca3ae454997'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c1dc19f0-7ef1-4064-a7a2-2a60445bab81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.247399', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0820589e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '5068d327d66850cf633d7ba5f87df6c40721509e17397978e662343c06074ede'}]}, 'timestamp': '2026-02-23 09:42:56.247845', '_unique_id': '46178c2f09334a138c88234d42441874'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e2e6ae7-fae5-4daa-a07b-436c31a6c457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.249960', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0820bca8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '0a1cd6259351da43a6defe76bbc4ae0042b87ce4d7d4ab81a7871884b1f22d34'}]}, 'timestamp': '2026-02-23 09:42:56.250407', '_unique_id': 'bc1da6c52bdf4acc83506dd3023e049a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6e0c9629-60e6-42e1-9753-4e3159bb4172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.252455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '08211dd8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '241d8b71d35dbb5dd35066193dc97e815cb86f5bab3686700e70602d9cfa42bf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.252455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '08212f44-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '515cf0fc49c964b1f9bd4a4df54ade23cbecec062ac58368d233917a1a985adf'}]}, 'timestamp': '2026-02-23 09:42:56.253317', '_unique_id': '7a86fbc30d444bdd8f2184c4e71ab51d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:42:56.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:42:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'db47753a-1dbb-4f2c-8a4e-384cc5f22849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.255443', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '082192c2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': 'aba981380a9062cb80cf93f22ce63c5582bab10078967c8fb4bba8ef89405724'}]}, 'timestamp': '2026-02-23 09:42:56.255924', '_unique_id': '3eeabad324444401bfbb2b2d54a83a51'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost
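The traceback above is a two-stage failure: amqp's socket connect raises ConnectionRefusedError, and kombu's _reraise_as_library_errors context manager re-raises it as kombu.exceptions.OperationalError via `raise ... from exc`, which is why the log prints "The above exception was the direct cause of the following exception:". A minimal sketch of that chaining pattern (the class and function names here are illustrative stand-ins, not kombu's actual code):

```python
from contextlib import contextmanager


class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError (illustrative only)."""


@contextmanager
def reraise_as_library_errors():
    # Mirrors the shape of the _reraise_as_library_errors frame in the
    # traceback: low-level OS/socket errors are re-raised as one
    # library-level exception type, chained with `from exc` so the original
    # ConnectionRefusedError survives as __cause__.
    try:
        yield
    except OSError as exc:
        raise OperationalError(str(exc)) from exc


def connect_to_broker():
    # Simulates the transport-level connect failing against a down broker.
    raise ConnectionRefusedError(111, "Connection refused")


caught = None
try:
    with reraise_as_library_errors():
        connect_to_broker()
except OperationalError as err:
    caught = err

print(caught)                  # [Errno 111] Connection refused
print(type(caught.__cause__))  # <class 'ConnectionRefusedError'>
```

Because the wrapped error keeps the original as `__cause__`, Python's traceback machinery prints both exceptions joined by the "direct cause" banner, exactly as seen in these log records.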
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '972313fa-3958-4b5d-b986-8e6a5ea34419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.257995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0821f64a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': 'd8938349b17c793e71986b3336ca0d6923875617edf70b1dfa1f5d04b1f8f5b7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.257995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0822059a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.336008021, 'message_signature': '796da8ad8633975a2e8ab53b5d949220e18792f45f3caae462985880ca30a234'}]}, 'timestamp': '2026-02-23 09:42:56.258801', '_unique_id': 'b1ca91d681c84d3c8c3669086444c1ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '60b5a15a-eacb-4de9-b6b4-d260e310abed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:42:56.260960', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '08226a26-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': '5ac16555c8ca7d90c6f9f6d2aa9f5ff4aaffdc3c5b449a5ff05dfb901e1711df'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:42:56.260960', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '08227ac0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.399504873, 'message_signature': 'bef25ac0dc028d86cd061b5992586371595f07f9ed684d6ffabafb59792f97aa'}]}, 'timestamp': '2026-02-23 09:42:56.261801', '_unique_id': '194ec8f5545149678ce65ba0f0f79b81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'f4dc75e9-4a64-4933-9f88-dfa4075b36e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:42:56.263333', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '0822c3e0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11216.325815073, 'message_signature': '4e178897efedb771b8143adb00d239491f8f95092d8534012de6f787848e9975'}]}, 'timestamp': '2026-02-23 09:42:56.263613', '_unique_id': '2a8c24e048fa4ff9ba82b798a2758c7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:42:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:42:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:42:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 04:42:58 localhost 
nova_compute[282206]: 2026-02-23 09:42:58.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:58 localhost nova_compute[282206]: 2026-02-23 09:42:58.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:42:58 localhost nova_compute[282206]: 2026-02-23 09:42:58.240 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:42:58 localhost nova_compute[282206]: 2026-02-23 09:42:58.240 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:58 localhost nova_compute[282206]: 2026-02-23 09:42:58.241 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:42:58 localhost nova_compute[282206]: 2026-02-23 09:42:58.242 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:43:02 localhost podman[287217]: 2026-02-23 09:43:02.902290362 +0000 UTC m=+0.076806418 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 04:43:02 localhost podman[287217]: 2026-02-23 09:43:02.915263878 +0000 UTC m=+0.089779914 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 23 04:43:02 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 04:43:03 localhost nova_compute[282206]: 2026-02-23 09:43:03.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:43:03 localhost nova_compute[282206]: 2026-02-23 09:43:03.245 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:43:03 localhost nova_compute[282206]: 2026-02-23 09:43:03.245 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:43:03 localhost nova_compute[282206]: 2026-02-23 09:43:03.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:43:03 localhost nova_compute[282206]: 2026-02-23 09:43:03.279 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:43:03 localhost nova_compute[282206]: 2026-02-23 09:43:03.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:43:07 localhost sshd[287238]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:43:08 localhost nova_compute[282206]: 2026-02-23 09:43:08.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:43:08 localhost nova_compute[282206]: 2026-02-23 09:43:08.282 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:43:08 localhost nova_compute[282206]: 2026-02-23 09:43:08.282 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:43:08 localhost nova_compute[282206]: 2026-02-23 09:43:08.283 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:43:08 localhost nova_compute[282206]: 2026-02-23 09:43:08.314 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:43:08 localhost nova_compute[282206]: 2026-02-23 09:43:08.315 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:43:09 localhost podman[242954]: time="2026-02-23T09:43:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 04:43:09 localhost podman[242954]: @ - - [23/Feb/2026:09:43:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150741 "" "Go-http-client/1.1"
Feb 23 04:43:09 localhost podman[242954]: @ - - [23/Feb/2026:09:43:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17288 "" "Go-http-client/1.1"
Feb 23 04:43:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:43:09 localhost podman[287240]: 2026-02-23 09:43:09.905424699 +0000 UTC m=+0.078625246 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 04:43:09 localhost podman[287240]: 2026-02-23 09:43:09.914508142 +0000 UTC m=+0.087708739 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 04:43:09 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.126 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.145 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.146 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.146 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.677 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.678 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.678 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 23 04:43:12 localhost nova_compute[282206]: 2026-02-23 09:43:12.678 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.058 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.079 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.080 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.082 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.082 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping...
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.083 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.109 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.110 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.110 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.111 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 
09:43:13.111 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.315 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.337 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.338 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:13 localhost openstack_network_exporter[245358]: ERROR 09:43:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:43:13 localhost openstack_network_exporter[245358]: Feb 23 04:43:13 localhost openstack_network_exporter[245358]: ERROR 09:43:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing 
datapath Feb 23 04:43:13 localhost openstack_network_exporter[245358]: Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.571 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.632 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.632 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.857 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.859 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12305MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.859 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.860 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.931 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.932 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.932 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:43:13 localhost nova_compute[282206]: 2026-02-23 09:43:13.968 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:43:14 localhost nova_compute[282206]: 2026-02-23 09:43:14.422 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:43:14 localhost nova_compute[282206]: 2026-02-23 09:43:14.429 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:43:14 localhost nova_compute[282206]: 
2026-02-23 09:43:14.445 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:43:14 localhost nova_compute[282206]: 2026-02-23 09:43:14.449 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:43:14 localhost nova_compute[282206]: 2026-02-23 09:43:14.450 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.590s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:43:15 localhost podman[287304]: 2026-02-23 09:43:15.969647886 +0000 UTC m=+0.144920636 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Feb 23 04:43:15 localhost podman[287304]: 2026-02-23 09:43:15.999357713 +0000 UTC m=+0.174630493 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible) Feb 23 04:43:16 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:43:16 localhost podman[287305]: 2026-02-23 09:43:15.950692934 +0000 UTC m=+0.121638019 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:43:16 localhost podman[287305]: 2026-02-23 09:43:16.083258332 +0000 UTC m=+0.254203467 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:43:16 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:43:16 localhost nova_compute[282206]: 2026-02-23 09:43:16.375 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:43:16 localhost nova_compute[282206]: 2026-02-23 09:43:16.376 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:43:18 localhost nova_compute[282206]: 2026-02-23 09:43:18.339 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:18 localhost nova_compute[282206]: 2026-02-23 09:43:18.360 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:18 localhost nova_compute[282206]: 2026-02-23 09:43:18.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5022 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:18 localhost nova_compute[282206]: 2026-02-23 09:43:18.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:18 localhost nova_compute[282206]: 2026-02-23 09:43:18.362 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:18 localhost nova_compute[282206]: 2026-02-23 09:43:18.363 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 
04:43:18 localhost sshd[287420]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:43:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:43:18 localhost systemd[1]: tmp-crun.RHNRpw.mount: Deactivated successfully. Feb 23 04:43:18 localhost podman[287422]: 2026-02-23 09:43:18.913813653 +0000 UTC m=+0.087387809 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:43:18 localhost podman[287422]: 2026-02-23 09:43:18.929292777 +0000 UTC m=+0.102866923 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 23 04:43:18 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:43:23 localhost nova_compute[282206]: 2026-02-23 09:43:23.363 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:23 localhost nova_compute[282206]: 2026-02-23 09:43:23.388 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:23 localhost nova_compute[282206]: 2026-02-23 09:43:23.388 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5025 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:23 localhost nova_compute[282206]: 2026-02-23 09:43:23.389 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:23 localhost nova_compute[282206]: 2026-02-23 09:43:23.390 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:23 localhost nova_compute[282206]: 2026-02-23 09:43:23.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:43:23 localhost podman[287459]: 2026-02-23 09:43:23.908136901 +0000 UTC m=+0.081378271 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:43:23 localhost 
podman[287459]: 2026-02-23 09:43:23.943195136 +0000 UTC m=+0.116436466 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS) Feb 23 04:43:23 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:43:26 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 13 from mon.1 Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13 Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.13 handle_mds_map state change up:standby --> up:replay Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.13 replay_start Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.13 waiting for osdmap 79 (which blocklists prior instance) Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.cache creating system inode with ino:0x100 Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.cache creating system inode with ino:0x1 Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.13 Finished replaying journal Feb 23 04:43:26 localhost ceph-mds[286877]: mds.0.13 making mds journal writeable Feb 23 04:43:27 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 14 from mon.1 Feb 23 04:43:27 localhost ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13 Feb 23 04:43:27 localhost ceph-mds[286877]: mds.0.13 handle_mds_map state change up:replay --> up:reconnect Feb 23 04:43:27 localhost ceph-mds[286877]: mds.0.13 reconnect_start Feb 23 04:43:27 localhost ceph-mds[286877]: mds.0.13 reopen_log Feb 23 04:43:27 localhost ceph-mds[286877]: mds.0.13 reconnect_done Feb 23 04:43:28 localhost nova_compute[282206]: 2026-02-23 09:43:28.394 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:28 localhost nova_compute[282206]: 2026-02-23 09:43:28.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:28 localhost nova_compute[282206]: 2026-02-23 09:43:28.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 
ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:28 localhost nova_compute[282206]: 2026-02-23 09:43:28.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:28 localhost nova_compute[282206]: 2026-02-23 09:43:28.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:28 localhost nova_compute[282206]: 2026-02-23 09:43:28.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:28 localhost systemd[1]: session-62.scope: Deactivated successfully. Feb 23 04:43:28 localhost systemd[1]: session-62.scope: Consumed 1.283s CPU time. Feb 23 04:43:28 localhost systemd-logind[759]: Session 62 logged out. Waiting for processes to exit. Feb 23 04:43:28 localhost systemd-logind[759]: Removed session 62. 
Feb 23 04:43:28 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 15 from mon.1 Feb 23 04:43:28 localhost ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13 Feb 23 04:43:28 localhost ceph-mds[286877]: mds.0.13 handle_mds_map state change up:reconnect --> up:rejoin Feb 23 04:43:28 localhost ceph-mds[286877]: mds.0.13 rejoin_start Feb 23 04:43:28 localhost ceph-mds[286877]: mds.0.13 rejoin_joint_start Feb 23 04:43:28 localhost ceph-mds[286877]: mds.0.13 rejoin_done Feb 23 04:43:29 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc Updating MDS map to version 16 from mon.1 Feb 23 04:43:29 localhost ceph-mds[286877]: mds.0.13 handle_mds_map i am now mds.0.13 Feb 23 04:43:29 localhost ceph-mds[286877]: mds.0.13 handle_mds_map state change up:rejoin --> up:active Feb 23 04:43:29 localhost ceph-mds[286877]: mds.0.13 recovery_done -- successful recovery! Feb 23 04:43:29 localhost ceph-mds[286877]: mds.0.13 active_start Feb 23 04:43:29 localhost ceph-mds[286877]: mds.0.13 cluster recovered. Feb 23 04:43:31 localhost ceph-mds[286877]: mds.pinger is_rank_lagging: rank=0 was never sent ping request. Feb 23 04:43:31 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mds-mds-np0005626463-qcthuc[286873]: 2026-02-23T09:43:30.999+0000 7f2bb035b640 -1 mds.pinger is_rank_lagging: rank=0 was never sent ping request. 
Feb 23 04:43:33 localhost nova_compute[282206]: 2026-02-23 09:43:33.411 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:33 localhost nova_compute[282206]: 2026-02-23 09:43:33.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:33 localhost nova_compute[282206]: 2026-02-23 09:43:33.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:33 localhost nova_compute[282206]: 2026-02-23 09:43:33.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:33 localhost nova_compute[282206]: 2026-02-23 09:43:33.448 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:33 localhost nova_compute[282206]: 2026-02-23 09:43:33.449 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:43:33 localhost podman[287491]: 2026-02-23 09:43:33.895324512 +0000 UTC m=+0.070294336 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:43:33 localhost podman[287491]: 2026-02-23 09:43:33.904346094 +0000 UTC m=+0.079315988 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:43:33 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:43:38 localhost nova_compute[282206]: 2026-02-23 09:43:38.449 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:38 localhost nova_compute[282206]: 2026-02-23 09:43:38.451 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:38 localhost nova_compute[282206]: 2026-02-23 09:43:38.451 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:38 localhost nova_compute[282206]: 2026-02-23 09:43:38.451 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:38 localhost nova_compute[282206]: 2026-02-23 09:43:38.487 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:38 localhost nova_compute[282206]: 2026-02-23 09:43:38.487 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:38 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 23 04:43:38 localhost systemd[286078]: Activating special unit Exit the Session... Feb 23 04:43:38 localhost systemd[286078]: Stopped target Main User Target. Feb 23 04:43:38 localhost systemd[286078]: Stopped target Basic System. 
Feb 23 04:43:38 localhost systemd[286078]: Stopped target Paths. Feb 23 04:43:38 localhost systemd[286078]: Stopped target Sockets. Feb 23 04:43:38 localhost systemd[286078]: Stopped target Timers. Feb 23 04:43:38 localhost systemd[286078]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 23 04:43:38 localhost systemd[286078]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 04:43:38 localhost systemd[286078]: Closed D-Bus User Message Bus Socket. Feb 23 04:43:38 localhost systemd[286078]: Stopped Create User's Volatile Files and Directories. Feb 23 04:43:38 localhost systemd[286078]: Removed slice User Application Slice. Feb 23 04:43:38 localhost systemd[286078]: Reached target Shutdown. Feb 23 04:43:38 localhost systemd[286078]: Finished Exit the Session. Feb 23 04:43:38 localhost systemd[286078]: Reached target Exit the Session. Feb 23 04:43:38 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 23 04:43:38 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 23 04:43:38 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 23 04:43:38 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 23 04:43:38 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 23 04:43:38 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 23 04:43:38 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 23 04:43:38 localhost systemd[1]: user-1003.slice: Consumed 1.633s CPU time. 
Feb 23 04:43:39 localhost podman[242954]: time="2026-02-23T09:43:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:43:39 localhost podman[242954]: @ - - [23/Feb/2026:09:43:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150741 "" "Go-http-client/1.1" Feb 23 04:43:39 localhost podman[242954]: @ - - [23/Feb/2026:09:43:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17295 "" "Go-http-client/1.1" Feb 23 04:43:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:43:40 localhost systemd[1]: tmp-crun.GxErOw.mount: Deactivated successfully. Feb 23 04:43:40 localhost podman[287516]: 2026-02-23 09:43:40.907092437 +0000 UTC m=+0.083046083 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:43:40 localhost podman[287516]: 2026-02-23 09:43:40.94434157 +0000 UTC m=+0.120295186 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter) Feb 23 04:43:40 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:43:43 localhost openstack_network_exporter[245358]: ERROR 09:43:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:43:43 localhost openstack_network_exporter[245358]: Feb 23 04:43:43 localhost openstack_network_exporter[245358]: ERROR 09:43:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:43:43 localhost openstack_network_exporter[245358]: Feb 23 04:43:43 localhost nova_compute[282206]: 2026-02-23 09:43:43.488 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:43 localhost nova_compute[282206]: 2026-02-23 09:43:43.489 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:43 localhost nova_compute[282206]: 2026-02-23 09:43:43.489 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:43 localhost nova_compute[282206]: 2026-02-23 09:43:43.489 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:43 localhost nova_compute[282206]: 2026-02-23 09:43:43.490 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:43 localhost nova_compute[282206]: 2026-02-23 09:43:43.491 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:44 localhost sshd[287571]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:43:46 localhost podman[287591]: 2026-02-23 09:43:46.909590767 +0000 UTC m=+0.084654794 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:43:46 localhost podman[287591]: 2026-02-23 09:43:46.944273639 +0000 UTC m=+0.119337686 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:43:46 localhost systemd[1]: tmp-crun.LGTarI.mount: Deactivated successfully. 
Feb 23 04:43:46 localhost podman[287592]: 2026-02-23 09:43:46.958934348 +0000 UTC m=+0.130926369 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:43:46 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:43:46 localhost podman[287592]: 2026-02-23 09:43:46.996524021 +0000 UTC m=+0.168516042 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:43:47 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:43:48 localhost nova_compute[282206]: 2026-02-23 09:43:48.493 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:48 localhost nova_compute[282206]: 2026-02-23 09:43:48.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:48 localhost nova_compute[282206]: 2026-02-23 09:43:48.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:48 localhost nova_compute[282206]: 2026-02-23 09:43:48.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:48 localhost nova_compute[282206]: 2026-02-23 09:43:48.524 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:48 localhost nova_compute[282206]: 2026-02-23 09:43:48.526 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:43:48.545 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:43:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:43:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 
23 04:43:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:43:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:43:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:43:49 localhost podman[287640]: 2026-02-23 09:43:49.93308932 +0000 UTC m=+0.108661973 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216) Feb 23 04:43:49 localhost podman[287640]: 2026-02-23 09:43:49.946343234 +0000 UTC m=+0.121915837 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:43:49 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:43:53 localhost nova_compute[282206]: 2026-02-23 09:43:53.527 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:53 localhost nova_compute[282206]: 2026-02-23 09:43:53.529 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:53 localhost nova_compute[282206]: 2026-02-23 09:43:53.529 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:53 localhost nova_compute[282206]: 2026-02-23 09:43:53.529 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:53 localhost nova_compute[282206]: 2026-02-23 09:43:53.556 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:53 localhost nova_compute[282206]: 2026-02-23 09:43:53.557 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:43:54 localhost podman[287658]: 2026-02-23 09:43:54.945647739 +0000 UTC m=+0.116517099 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 04:43:54 localhost podman[287658]: 2026-02-23 09:43:54.975245512 +0000 UTC m=+0.146114882 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:43:54 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:43:58 localhost nova_compute[282206]: 2026-02-23 09:43:58.558 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:58 localhost nova_compute[282206]: 2026-02-23 09:43:58.590 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:43:58 localhost nova_compute[282206]: 2026-02-23 09:43:58.591 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:43:58 localhost nova_compute[282206]: 2026-02-23 09:43:58.591 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:43:58 localhost nova_compute[282206]: 2026-02-23 09:43:58.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:43:58 localhost nova_compute[282206]: 2026-02-23 09:43:58.593 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:03 localhost nova_compute[282206]: 2026-02-23 09:44:03.593 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 
23 04:44:03 localhost nova_compute[282206]: 2026-02-23 09:44:03.596 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:03 localhost nova_compute[282206]: 2026-02-23 09:44:03.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:03 localhost nova_compute[282206]: 2026-02-23 09:44:03.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:03 localhost nova_compute[282206]: 2026-02-23 09:44:03.630 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:03 localhost nova_compute[282206]: 2026-02-23 09:44:03.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:44:04 localhost podman[287675]: 2026-02-23 09:44:04.903389089 +0000 UTC m=+0.079740040 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:44:04 localhost podman[287675]: 2026-02-23 09:44:04.915131405 +0000 UTC m=+0.091482386 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:44:04 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:44:08 localhost sshd[287716]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:44:08 localhost nova_compute[282206]: 2026-02-23 09:44:08.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:08 localhost nova_compute[282206]: 2026-02-23 09:44:08.662 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:08 localhost nova_compute[282206]: 2026-02-23 09:44:08.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:08 localhost nova_compute[282206]: 2026-02-23 09:44:08.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:08 localhost nova_compute[282206]: 2026-02-23 09:44:08.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:08 localhost nova_compute[282206]: 2026-02-23 09:44:08.664 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:44:09 localhost podman[242954]: time="2026-02-23T09:44:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:44:09 localhost podman[242954]: @ - - [23/Feb/2026:09:44:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150741 "" "Go-http-client/1.1" Feb 23 04:44:09 localhost podman[242954]: @ - - [23/Feb/2026:09:44:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17296 "" "Go-http-client/1.1" Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.705 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.705 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.706 282211 DEBUG nova.network.neutron [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:44:09 localhost nova_compute[282206]: 2026-02-23 09:44:09.706 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.195 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.216 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.217 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.218 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.218 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.236 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.236 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:10 
localhost nova_compute[282206]: 2026-02-23 09:44:10.237 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 23 04:44:10 localhost nova_compute[282206]: 2026-02-23 09:44:10.300 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:44:11 localhost systemd[1]: tmp-crun.snwVdA.mount: Deactivated successfully. Feb 23 04:44:11 localhost podman[287790]: 2026-02-23 09:44:11.131131989 +0000 UTC m=+0.072041140 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:44:11 localhost podman[287790]: 2026-02-23 09:44:11.140254663 +0000 UTC m=+0.081163834 container exec_died 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:44:11 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:44:11 localhost nova_compute[282206]: 2026-02-23 09:44:11.183 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:11 localhost podman[287853]: Feb 23 04:44:11 localhost podman[287853]: 2026-02-23 09:44:11.613412594 +0000 UTC m=+0.076421316 container create f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:44:11 localhost systemd[1]: Started libpod-conmon-f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236.scope. Feb 23 04:44:11 localhost systemd[1]: Started libcrun container. 
Feb 23 04:44:11 localhost podman[287853]: 2026-02-23 09:44:11.583579953 +0000 UTC m=+0.046588705 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:44:11 localhost podman[287853]: 2026-02-23 09:44:11.6908009 +0000 UTC m=+0.153809622 container init f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, release=1770267347, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7)
Feb 23 04:44:11 localhost podman[287853]: 2026-02-23 09:44:11.701265427 +0000 UTC m=+0.164274149 container start f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, version=7, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 23 04:44:11 localhost podman[287853]: 2026-02-23 09:44:11.701551246 +0000 UTC m=+0.164560048 container attach f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 04:44:11 localhost friendly_moser[287868]: 167 167
Feb 23 04:44:11 localhost systemd[1]: libpod-f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236.scope: Deactivated successfully.
Feb 23 04:44:11 localhost podman[287853]: 2026-02-23 09:44:11.705941872 +0000 UTC m=+0.168950644 container died f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, RELEASE=main, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Feb 23 04:44:11 localhost podman[287873]: 2026-02-23 09:44:11.825964969 +0000 UTC m=+0.107613760 container remove f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_moser, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 23 04:44:11 localhost systemd[1]: libpod-conmon-f476c343f6d9f7f505fb02d81d9861b282cea2723641b7844bdc2099792e4236.scope: Deactivated successfully.
Feb 23 04:44:11 localhost systemd[1]: Reloading.
Feb 23 04:44:11 localhost systemd-rc-local-generator[287913]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:44:12 localhost systemd-sysv-generator[287919]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost nova_compute[282206]: 2026-02-23 09:44:12.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:44:12 localhost nova_compute[282206]: 2026-02-23 09:44:12.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:44:12 localhost nova_compute[282206]: 2026-02-23 09:44:12.057 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:44:12 localhost nova_compute[282206]: 2026-02-23 09:44:12.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-d05beb6bd41232dd856a5c5ba15720c8e82328032deb98cbf61da7e6a6c1578c-merged.mount: Deactivated successfully.
Feb 23 04:44:12 localhost systemd[1]: Reloading.
Feb 23 04:44:12 localhost systemd-sysv-generator[287958]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:44:12 localhost systemd-rc-local-generator[287954]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:12 localhost systemd[1]: Starting Ceph mgr.np0005626463.wtksup for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 04:44:12 localhost podman[288018]:
Feb 23 04:44:12 localhost podman[288018]: 2026-02-23 09:44:12.861437894 +0000 UTC m=+0.056254047 container create bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, architecture=x86_64, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc.)
Feb 23 04:44:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fda07ed9ae315dcbb981d4f298b8689abd147eab4ae29411d44ea165dd566889/merged/var/lib/ceph/mgr/ceph-np0005626463.wtksup supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:12 localhost podman[288018]: 2026-02-23 09:44:12.911888979 +0000 UTC m=+0.106705122 container init bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup, io.buildah.version=1.42.2, ceph=True, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, release=1770267347, version=7, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 23 04:44:12 localhost podman[288018]: 2026-02-23 09:44:12.920526249 +0000 UTC m=+0.115342402 container start bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph)
Feb 23 04:44:12 localhost bash[288018]: bb57b00fb4ffcc023092512e750b21cd585f41b4202d5c160e4cae01fa164fb4
Feb 23 04:44:12 localhost podman[288018]: 2026-02-23 09:44:12.838969493 +0000 UTC m=+0.033785636 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:44:12 localhost systemd[1]: Started Ceph mgr.np0005626463.wtksup for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 04:44:12 localhost ceph-mgr[288036]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 04:44:12 localhost ceph-mgr[288036]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2
Feb 23 04:44:12 localhost ceph-mgr[288036]: pidfile_write: ignore empty --pid-file
Feb 23 04:44:12 localhost ceph-mgr[288036]: mgr[py] Loading python module 'alerts'
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Loading python module 'balancer'
Feb 23 04:44:13 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:13.078+0000 7f4486b65140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.085 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.085 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.085 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.086 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.086 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Loading python module 'cephadm'
Feb 23 04:44:13 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:13.144+0000 7f4486b65140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 23 04:44:13 localhost openstack_network_exporter[245358]: ERROR 09:44:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 04:44:13 localhost openstack_network_exporter[245358]:
Feb 23 04:44:13 localhost openstack_network_exporter[245358]: ERROR 09:44:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 04:44:13 localhost openstack_network_exporter[245358]:
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.584 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.647 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.648 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.664 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Loading python module 'crash'
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 23 04:44:13 localhost ceph-mgr[288036]: mgr[py] Loading python module 'dashboard'
Feb 23 04:44:13 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:13.844+0000 7f4486b65140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.858 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.860 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=12256MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.860 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.861 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.963 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.964 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 23 04:44:13 localhost nova_compute[282206]: 2026-02-23 09:44:13.964 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.016 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.107 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.108 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.131 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.160 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.209 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'devicehealth'
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'diskprediction_local'
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.386+0000 7f4486b65140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: from numpy import show_config as show_numpy_config
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'influx'
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.528+0000 7f4486b65140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'insights'
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.587+0000 7f4486b65140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'iostat'
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.661 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.669 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.688 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.690 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 23 04:44:14 localhost nova_compute[282206]: 2026-02-23 09:44:14.691 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'k8sevents'
Feb 23 04:44:14 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:14.699+0000 7f4486b65140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 23 04:44:14 localhost ceph-mgr[288036]: mgr[py] Loading python module 'localpool'
Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'mds_autoscaler'
Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'mirroring'
Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'nfs'
Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Module
nfs has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'orchestrator' Feb 23 04:44:15 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.399+0000 7f4486b65140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'osd_perf_query' Feb 23 04:44:15 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.538+0000 7f4486b65140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'osd_support' Feb 23 04:44:15 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.601+0000 7f4486b65140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'pg_autoscaler' Feb 23 04:44:15 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.655+0000 7f4486b65140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost nova_compute[282206]: 2026-02-23 09:44:15.688 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost 
ceph-mgr[288036]: mgr[py] Loading python module 'progress' Feb 23 04:44:15 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.720+0000 7f4486b65140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Module progress has missing NOTIFY_TYPES member Feb 23 04:44:15 localhost ceph-mgr[288036]: mgr[py] Loading python module 'prometheus' Feb 23 04:44:15 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:15.777+0000 7f4486b65140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost nova_compute[282206]: 2026-02-23 09:44:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Loading python module 'rbd_support' Feb 23 04:44:16 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.066+0000 7f4486b65140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Loading python module 'restful' Feb 23 04:44:16 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.145+0000 7f4486b65140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Loading python module 'rgw' Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] 
Loading python module 'rook' Feb 23 04:44:16 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.503+0000 7f4486b65140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Module rook has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Loading python module 'selftest' Feb 23 04:44:16 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.908+0000 7f4486b65140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 23 04:44:16 localhost ceph-mgr[288036]: mgr[py] Loading python module 'snap_schedule' Feb 23 04:44:16 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:16.969+0000 7f4486b65140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'stats' Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'status' Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Module status has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'telegraf' Feb 23 04:44:17 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.160+0000 7f4486b65140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'telemetry' Feb 23 04:44:17 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.218+0000 7f4486b65140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Module 
telemetry has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'test_orchestrator' Feb 23 04:44:17 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.351+0000 7f4486b65140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'volumes' Feb 23 04:44:17 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.498+0000 7f4486b65140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Loading python module 'zabbix' Feb 23 04:44:17 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.683+0000 7f4486b65140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:44:17.743+0000 7f4486b65140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 23 04:44:17 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 23 04:44:17 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:6800/920472675 Feb 23 04:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:44:17 localhost podman[288126]: 2026-02-23 09:44:17.932100998 +0000 UTC m=+0.097114230 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:44:18 localhost systemd[1]: tmp-crun.V3arwi.mount: Deactivated successfully. 
Feb 23 04:44:18 localhost podman[288127]: 2026-02-23 09:44:18.011561724 +0000 UTC m=+0.176929587 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:44:18 localhost podman[288126]: 2026-02-23 09:44:18.015226007 +0000 UTC m=+0.180239259 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.schema-version=1.0) Feb 23 04:44:18 localhost podman[288127]: 2026-02-23 09:44:18.023182251 +0000 UTC m=+0.188550074 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:44:18 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:44:18 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:44:18 localhost nova_compute[282206]: 2026-02-23 09:44:18.667 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:19 localhost systemd[1]: tmp-crun.tI7NCp.mount: Deactivated successfully. 
Feb 23 04:44:19 localhost podman[288281]: 2026-02-23 09:44:19.00982418 +0000 UTC m=+0.099975608 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 23 04:44:19 localhost podman[288281]: 2026-02-23 09:44:19.109414273 +0000 UTC m=+0.199565711 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph 
Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, distribution-scope=public) Feb 23 04:44:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:44:20 localhost podman[288466]: 2026-02-23 09:44:20.914918665 +0000 UTC m=+0.086631858 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0) Feb 23 04:44:20 localhost podman[288466]: 2026-02-23 09:44:20.930370318 +0000 UTC m=+0.102083501 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:44:20 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:44:21 localhost sshd[288487]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:44:23 localhost nova_compute[282206]: 2026-02-23 09:44:23.672 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:23 localhost nova_compute[282206]: 2026-02-23 09:44:23.674 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:23 localhost nova_compute[282206]: 2026-02-23 09:44:23.675 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:23 localhost nova_compute[282206]: 2026-02-23 09:44:23.675 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:23 localhost nova_compute[282206]: 2026-02-23 09:44:23.716 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:23 localhost nova_compute[282206]: 2026-02-23 09:44:23.717 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:44:25 localhost podman[289164]: 2026-02-23 09:44:25.374033375 +0000 UTC m=+0.085474872 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:44:25 localhost 
podman[289164]: 2026-02-23 09:44:25.379897256 +0000 UTC m=+0.091338773 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:44:25 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:44:28 localhost nova_compute[282206]: 2026-02-23 09:44:28.718 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:28 localhost nova_compute[282206]: 2026-02-23 09:44:28.720 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:28 localhost nova_compute[282206]: 2026-02-23 09:44:28.720 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:28 localhost nova_compute[282206]: 2026-02-23 09:44:28.720 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:28 localhost nova_compute[282206]: 2026-02-23 09:44:28.762 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:28 localhost nova_compute[282206]: 2026-02-23 09:44:28.763 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:31 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e4f20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 23 04:44:33 localhost nova_compute[282206]: 2026-02-23 09:44:33.763 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:33 localhost nova_compute[282206]: 2026-02-23 09:44:33.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 
23 04:44:33 localhost nova_compute[282206]: 2026-02-23 09:44:33.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:33 localhost nova_compute[282206]: 2026-02-23 09:44:33.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:33 localhost nova_compute[282206]: 2026-02-23 09:44:33.816 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:33 localhost nova_compute[282206]: 2026-02-23 09:44:33.816 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:44:35 localhost podman[289184]: 2026-02-23 09:44:35.903443682 +0000 UTC m=+0.078870970 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:44:35 localhost podman[289184]: 2026-02-23 09:44:35.915150931 +0000 UTC m=+0.090578209 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:44:35 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 04:44:36 localhost ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors
Feb 23 04:44:37 localhost podman[289287]:
Feb 23 04:44:37 localhost podman[289287]: 2026-02-23 09:44:37.598394322 +0000 UTC m=+0.072693001 container create 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, architecture=x86_64, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Feb 23 04:44:37 localhost systemd[1]: Started libpod-conmon-386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9.scope.
Feb 23 04:44:37 localhost systemd[1]: Started libcrun container.
Feb 23 04:44:37 localhost podman[289287]: 2026-02-23 09:44:37.567078822 +0000 UTC m=+0.041377541 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:44:37 localhost podman[289287]: 2026-02-23 09:44:37.676709434 +0000 UTC m=+0.151008113 container init 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, vcs-type=git, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.buildah.version=1.42.2, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 23 04:44:37 localhost podman[289287]: 2026-02-23 09:44:37.68634638 +0000 UTC m=+0.160645079 container start 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, version=7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main)
Feb 23 04:44:37 localhost podman[289287]: 2026-02-23 09:44:37.686639589 +0000 UTC m=+0.160945148 container attach 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Feb 23 04:44:37 localhost compassionate_bhaskara[289302]: 167 167
Feb 23 04:44:37 localhost systemd[1]: libpod-386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9.scope: Deactivated successfully.
Feb 23 04:44:37 localhost podman[289307]: 2026-02-23 09:44:37.760838164 +0000 UTC m=+0.056363760 container died 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, release=1770267347, name=rhceph, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, version=7)
Feb 23 04:44:37 localhost podman[289307]: 2026-02-23 09:44:37.795255959 +0000 UTC m=+0.090781495 container remove 386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_bhaskara, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, distribution-scope=public, vcs-type=git, RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2)
Feb 23 04:44:37 localhost systemd[1]: libpod-conmon-386b7497eb3c18ad0c0d129dc9fd4bb398dd9808ba346e5d17dce258d6f4ecd9.scope: Deactivated successfully.
Feb 23 04:44:37 localhost podman[289324]:
Feb 23 04:44:37 localhost podman[289324]: 2026-02-23 09:44:37.89407008 +0000 UTC m=+0.068766620 container create a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc.)
Feb 23 04:44:37 localhost systemd[1]: Started libpod-conmon-a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f.scope.
Feb 23 04:44:37 localhost systemd[1]: Started libcrun container.
Feb 23 04:44:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/149c9e966aad0d2944300a07deaa8ea741c20dedccfa29c7e8f1268d650045ce/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:37 localhost podman[289324]: 2026-02-23 09:44:37.948664184 +0000 UTC m=+0.123360724 container init a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 23 04:44:37 localhost podman[289324]: 2026-02-23 09:44:37.958276509 +0000 UTC m=+0.132973049 container start a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, build-date=2026-02-09T10:25:24Z, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-type=git, GIT_BRANCH=main, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc.)
Feb 23 04:44:37 localhost podman[289324]: 2026-02-23 09:44:37.958518456 +0000 UTC m=+0.133215026 container attach a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, CEPH_POINT_RELEASE=, release=1770267347, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 23 04:44:37 localhost podman[289324]: 2026-02-23 09:44:37.869746074 +0000 UTC m=+0.044442614 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:44:38 localhost systemd[1]: libpod-a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f.scope: Deactivated successfully.
Feb 23 04:44:38 localhost podman[289324]: 2026-02-23 09:44:38.054656505 +0000 UTC m=+0.229353075 container died a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, distribution-scope=public, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 23 04:44:38 localhost podman[289366]: 2026-02-23 09:44:38.143722806 +0000 UTC m=+0.079869331 container remove a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_yalow, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, name=rhceph, distribution-scope=public)
Feb 23 04:44:38 localhost systemd[1]: libpod-conmon-a4a57159a55c33e21db6f47ec55e653b5f5903a5592f26551d3ad121f7f59f5f.scope: Deactivated successfully.
Feb 23 04:44:38 localhost systemd[1]: Reloading.
Feb 23 04:44:38 localhost systemd-sysv-generator[289405]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:44:38 localhost systemd-rc-local-generator[289402]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5080 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: var-lib-containers-storage-overlay-43ba382414e8e5720309cffa5ee892596c9f325953822f0ed270cb7544612577-merged.mount: Deactivated successfully.
Feb 23 04:44:38 localhost systemd[1]: Reloading.
Feb 23 04:44:38 localhost systemd-rc-local-generator[289445]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 23 04:44:38 localhost systemd-sysv-generator[289452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 23 04:44:38 localhost nova_compute[282206]: 2026-02-23 09:44:38.868 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:44:38 localhost nova_compute[282206]: 2026-02-23 09:44:38.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:44:38 localhost nova_compute[282206]: 2026-02-23 09:44:38.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5055 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:44:38 localhost nova_compute[282206]: 2026-02-23 09:44:38.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:44:38 localhost nova_compute[282206]: 2026-02-23 09:44:38.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:44:38 localhost nova_compute[282206]: 2026-02-23 09:44:38.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:44:38 localhost systemd[1]: Starting Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46...
Feb 23 04:44:39 localhost podman[289510]:
Feb 23 04:44:39 localhost podman[289510]: 2026-02-23 09:44:39.231729273 +0000 UTC m=+0.073881807 container create 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, release=1770267347, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z)
Feb 23 04:44:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff)
Feb 23 04:44:39 localhost podman[289510]: 2026-02-23 09:44:39.201465014 +0000 UTC m=+0.043617568 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:44:39 localhost podman[289510]: 2026-02-23 09:44:39.303825544 +0000 UTC m=+0.145978078 container init 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, RELEASE=main, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 23 04:44:39 localhost podman[289510]: 2026-02-23 09:44:39.315591305 +0000 UTC m=+0.157743849 container start 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z)
Feb 23 04:44:39 localhost bash[289510]: 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301
Feb 23 04:44:39 localhost systemd[1]: Started Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46.
Feb 23 04:44:39 localhost ceph-mon[289530]: set uid:gid to 167:167 (ceph:ceph)
Feb 23 04:44:39 localhost ceph-mon[289530]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2
Feb 23 04:44:39 localhost ceph-mon[289530]: pidfile_write: ignore empty --pid-file
Feb 23 04:44:39 localhost ceph-mon[289530]: load: jerasure load: lrc
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: RocksDB version: 7.9.2
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Git sha 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: DB SUMMARY
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: DB Session ID: 5E7LZX0BBD5RWYSEID7U
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: CURRENT file: CURRENT
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: IDENTITY file: IDENTITY
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005626463/store.db dir, Total Num: 0, files:
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005626463/store.db: 000004.log size: 886 ;
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.error_if_exists: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.create_if_missing: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.paranoid_checks: 1
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.flush_verify_memtable_count: 1
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.env: 0x5571fa965a20
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.fs: PosixFileSystem
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.info_log: 0x5571fbc72d20
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_file_opening_threads: 16
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.statistics: (nil)
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.use_fsync: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_log_file_size: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_manifest_file_size: 1073741824
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.log_file_time_to_roll: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.keep_log_file_num: 1000
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.recycle_log_file_num: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.allow_fallocate: 1
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.allow_mmap_reads: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.allow_mmap_writes: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.use_direct_reads: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.create_missing_column_families: 0
Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.db_log_dir: Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.wal_dir: Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.table_cache_numshardbits: 6 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.is_fd_close_on_exec: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.advise_random_on_open: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.db_write_buffer_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.write_buffer_manager: 0x5571fbc83540 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.use_adaptive_mutex: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.rate_limiter: (nil) Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.wal_recovery_mode: 2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.enable_thread_tracking: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.enable_pipelined_write: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.unordered_write: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: 
Options.write_thread_max_yield_usec: 100 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.row_cache: None Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.wal_filter: None Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.allow_ingest_behind: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.two_write_queues: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.manual_wal_flush: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.wal_compression: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.atomic_flush: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.persist_stats_to_disk: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.log_readahead_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.best_efforts_recovery: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.allow_data_in_errors: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.db_host_id: __hostname__ Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.enforce_single_del_contracts: true Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_background_jobs: 2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_background_compactions: -1 Feb 23 04:44:39 localhost 
ceph-mon[289530]: rocksdb: Options.max_subcompactions: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.delayed_write_rate : 16777216 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_total_wal_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.stats_dump_period_sec: 600 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.stats_persist_period_sec: 600 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_open_files: -1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bytes_per_sync: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_readahead_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_background_flushes: -1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Compression algorithms supported: Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kZSTD supported: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kXpressCompression supported: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kBZip2Compression supported: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kLZ4Compression supported: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kZlibCompression supported: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kLZ4HCCompression 
supported: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: #011kSnappyCompression supported: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: DMutex implementation: pthread_mutex_t Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.comparator: leveldb.BytewiseComparator Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.merge_operator: Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_filter: None Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_filter_factory: None Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.sst_partitioner_factory: None Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5571fbc72980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5571fbc6f350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 
block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.write_buffer_size: 33554432 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_write_buffer_number: 2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression: NoCompression Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression: Disabled Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.prefix_extractor: nullptr Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.num_levels: 7 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 04:44:39 localhost 
ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.level: 32767 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.strategy: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.enabled: false Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_base: 268435456 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Feb 23 04:44:39 localhost 
ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.arena_block_size: 1048576 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: 
Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.table_properties_collectors: Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.inplace_update_support: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.bloom_locality: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.max_successive_merges: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.force_consistency_checks: 1 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.ttl: 2592000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 
23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.enable_blob_files: false Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.min_blob_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.blob_file_size: 268435456 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3d1e4b58-ab15-4081-a9da-984e46fdc8b2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839879374784, "job": 1, "event": "recovery_started", "wal_files": [4]} Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 
1771839879377093, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839879, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3d1e4b58-ab15-4081-a9da-984e46fdc8b2", "db_session_id": "5E7LZX0BBD5RWYSEID7U", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839879377207, "job": 1, "event": "recovery_finished"} Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager 
instance 0x5571fbc96e00 Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: DB pointer 0x5571fbd8c000 Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463 does not exist in monmap, will attempt to join an existing cluster Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:44:39 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.8 0.00 0.00 1 0.002 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5571fbc6f350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(2,1.08 KB,0.000205636%)#012#012** File Read Latency Histogram By Level [default] ** Feb 23 04:44:39 localhost ceph-mon[289530]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] Feb 23 04:44:39 localhost ceph-mon[289530]: starting mon.np0005626463 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005626463 fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:44:39 localhost ceph-mon[289530]: 
mon.np0005626463@-1(???) e0 preinit fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:44:39 localhost podman[242954]: time="2026-02-23T09:44:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing) e5 sync_obtain_latest_monmap Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5 Feb 23 04:44:39 localhost podman[242954]: @ - - [23/Feb/2026:09:44:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 23 04:44:39 localhost podman[242954]: @ - - [23/Feb/2026:09:44:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18253 "" "Go-http-client/1.1" Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).mds e17 new map Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01116#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-02-23T07:57:46.097663+0000#012modified#0112026-02-23T09:43:29.529267+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01179#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in 
separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26518}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26518 members: 26518#012[mds.mds.np0005626463.qcthuc{0:26518} state up:active seq 13 addr [v2:172.18.0.106:6808/2515508693,v1:172.18.0.106:6809/2515508693] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005626465.drvnoy{-1:26498} state up:standby seq 1 addr [v2:172.18.0.107:6808/2939113664,v1:172.18.0.107:6809/2939113664] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005626466.vaywlp{-1:26506} state up:standby seq 1 addr [v2:172.18.0.108:6808/2035422599,v1:172.18.0.108:6809/2035422599] compat {c=[1],r=[1],i=[17ff]}] Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 3314933000854323200, adjusting msgr requires Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 432629239337189376, adjusting msgr requires Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 432629239337189376, adjusting msgr requires Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).osd e80 crush map has features 432629239337189376, adjusting msgr requires Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mgr to host np0005626463.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.2", "name": 
"osd_memory_target"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: Adjusting osd_memory_target on np0005626463.localdomain to 3396M Feb 23 04:44:39 localhost ceph-mon[289530]: Adjusting osd_memory_target on np0005626465.localdomain to 3396M Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Adjusting osd_memory_target on np0005626466.localdomain to 3396M Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 
172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mgr to host np0005626465.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mgr to host np0005626466.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Saving service mgr spec with placement label:mgr Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' 
entity='mgr.np0005626459.pmtxxl' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 23 04:44:39 localhost ceph-mon[289530]: Deploying daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 23 04:44:39 localhost ceph-mon[289530]: Deploying daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mon to host np0005626459.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label _admin to host np0005626459.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 
04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Feb 23 04:44:39 localhost ceph-mon[289530]: Deploying daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mon to host np0005626460.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label _admin to host np0005626460.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mon to host 
np0005626461.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label _admin to host np0005626461.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mon to host np0005626463.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' 
entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label _admin to host np0005626463.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mon to host np0005626465.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label _admin to host np0005626465.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label mon to host np0005626466.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": 
"client.admin"} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: Added label _admin to host np0005626466.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:44:39 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: Saving service mon spec with placement label:mon Feb 23 04:44:39 localhost ceph-mon[289530]: Deploying daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: Deploying daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626461 calling monitor election Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626459 calling 
monitor election Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626460 calling monitor election Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626466 calling monitor election Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626459 is new leader, mons np0005626459,np0005626461,np0005626460,np0005626466 in quorum (ranks 0,1,2,3) Feb 23 04:44:39 localhost ceph-mon[289530]: overall HEALTH_OK Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:39 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:44:39 localhost ceph-mon[289530]: Deploying daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:44:39 localhost ceph-mon[289530]: mon.np0005626463@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Feb 23 04:44:39 localhost systemd[1]: tmp-crun.ITiuMp.mount: Deactivated successfully. Feb 23 04:44:39 localhost sshd[289569]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:44:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:44:41 localhost systemd[1]: tmp-crun.uJOoUB.mount: Deactivated successfully. 
Feb 23 04:44:41 localhost podman[289571]: 2026-02-23 09:44:41.313258109 +0000 UTC m=+0.084441031 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64) Feb 23 04:44:41 localhost podman[289571]: 2026-02-23 09:44:41.325459094 +0000 UTC m=+0.096642066 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:44:41 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:44:43 localhost openstack_network_exporter[245358]: ERROR 09:44:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:44:43 localhost openstack_network_exporter[245358]: Feb 23 04:44:43 localhost openstack_network_exporter[245358]: ERROR 09:44:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:44:43 localhost openstack_network_exporter[245358]: Feb 23 04:44:43 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e51e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 23 04:44:43 localhost nova_compute[282206]: 2026-02-23 09:44:43.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:43 localhost nova_compute[282206]: 2026-02-23 09:44:43.877 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:45 localhost podman[289721]: 2026-02-23 09:44:45.021610317 +0000 UTC m=+0.073809685 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, name=rhceph) Feb 23 04:44:45 localhost podman[289721]: 2026-02-23 09:44:45.1074557 +0000 UTC m=+0.159655058 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 04:44:45 localhost ceph-mon[289530]: mon.np0005626463@-1(probing) e6 my rank is now 5 (was -1) Feb 23 04:44:45 localhost ceph-mon[289530]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:44:45 localhost ceph-mon[289530]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 Feb 23 04:44:45 localhost ceph-mon[289530]: mon.np0005626463@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:44:48 localhost ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors Feb 23 04:44:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:44:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:44:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:44:48.546 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:44:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:44:48.547 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626463@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626463@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap 
encoding,6=support isa/lrc erasure code,7=support shec erasure code} Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626463@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626459 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626460 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626461 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626466 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626465 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626459 is new leader, mons np0005626459,np0005626461,np0005626460,np0005626466,np0005626465 in quorum (ranks 0,1,2,3,4) Feb 23 04:44:48 localhost ceph-mon[289530]: overall HEALTH_OK Feb 23 04:44:48 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:48 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626463@5(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:44:48 localhost ceph-mon[289530]: mgrc update_daemon_metadata mon.np0005626463 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005626463.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome 
Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005626463.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux} Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626459 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626460 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626461 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626466 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626465 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626463 calling monitor election Feb 23 04:44:48 localhost ceph-mon[289530]: mon.np0005626459 is new leader, mons np0005626459,np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4,5) Feb 23 04:44:48 localhost ceph-mon[289530]: overall HEALTH_OK Feb 23 04:44:48 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:48 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:44:48 localhost nova_compute[282206]: 2026-02-23 09:44:48.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:48 localhost nova_compute[282206]: 2026-02-23 09:44:48.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:48 localhost nova_compute[282206]: 2026-02-23 09:44:48.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:48 localhost nova_compute[282206]: 2026-02-23 09:44:48.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:48 localhost nova_compute[282206]: 2026-02-23 09:44:48.909 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:48 localhost nova_compute[282206]: 2026-02-23 09:44:48.910 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:48 localhost podman[289848]: 2026-02-23 09:44:48.941697148 +0000 UTC m=+0.108784108 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216) Feb 23 04:44:48 localhost podman[289848]: 2026-02-23 09:44:48.977167836 +0000 UTC m=+0.144254796 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:44:48 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:44:48 localhost podman[289850]: 2026-02-23 09:44:48.989121602 +0000 UTC m=+0.155694925 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:44:48 localhost podman[289850]: 2026-02-23 09:44:48.997274362 +0000 UTC m=+0.163847715 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:44:49 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:44:49 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:49 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:49 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:49 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:49 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:49 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:44:49 localhost ceph-mon[289530]: Updating np0005626459.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:49 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:49 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:49 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:49 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:49 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: Updating np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: 
Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:44:51 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:44:51 localhost podman[290228]: 2026-02-23 09:44:51.750955532 +0000 UTC m=+0.078771307 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
config_id=ceilometer_agent_compute) Feb 23 04:44:51 localhost podman[290228]: 2026-02-23 09:44:51.787215844 +0000 UTC m=+0.115031649 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0) Feb 23 04:44:51 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:44:52 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:44:52 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:52 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:52 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:52 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:44:53 localhost ceph-mon[289530]: Reconfiguring mon.np0005626459 (monmap changed)... 
Feb 23 04:44:53 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626459 on np0005626459.localdomain Feb 23 04:44:53 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:53 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:53 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626459.pmtxxl", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:44:53 localhost nova_compute[282206]: 2026-02-23 09:44:53.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:53 localhost nova_compute[282206]: 2026-02-23 09:44:53.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:53 localhost nova_compute[282206]: 2026-02-23 09:44:53.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:53 localhost nova_compute[282206]: 2026-02-23 09:44:53.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:53 localhost nova_compute[282206]: 2026-02-23 09:44:53.954 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:53 localhost nova_compute[282206]: 2026-02-23 09:44:53.955 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:54 localhost ceph-mon[289530]: 
Reconfiguring mgr.np0005626459.pmtxxl (monmap changed)... Feb 23 04:44:54 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626459.pmtxxl on np0005626459.localdomain Feb 23 04:44:54 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:54 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:54 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626459", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:44:54 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:54 localhost sshd[290247]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:44:55 localhost ceph-mon[289530]: Reconfiguring crash.np0005626459 (monmap changed)... Feb 23 04:44:55 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626459 on np0005626459.localdomain Feb 23 04:44:55 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:55 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:55 localhost ceph-mon[289530]: Reconfiguring crash.np0005626460 (monmap changed)... Feb 23 04:44:55 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:44:55 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain Feb 23 04:44:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:44:55 localhost systemd[1]: tmp-crun.JFkVLa.mount: Deactivated successfully. Feb 23 04:44:55 localhost podman[290249]: 2026-02-23 09:44:55.9191001 +0000 UTC m=+0.089533307 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:44:55 localhost podman[290249]: 2026-02-23 09:44:55.924496376 +0000 UTC m=+0.094929573 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:44:55 
localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.139 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '027a236c-9661-4136-803c-165ba7fd8164', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.135122', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4f967c12-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '6238137ca37243738933269c579d870477fda1ad7cc992ca17b67c91e24bfd1f'}]}, 'timestamp': '2026-02-23 09:44:56.140389', '_unique_id': 'f0ee13f090ed4be298b84af87e7ed130'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.141 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.155 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.155 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89c3c41d-8857-4870-8de2-1fa858259751', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.143215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f98cd8c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': '624d8d0631ebbfed8817452f34d8460c7fab70cc623556e03f4ae83c34b72957'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.143215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f98e006-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': 'ddf55727eafa165a094b3246dcf9c6ef07590f8c4d97a2347e9058eeb9efe3aa'}]}, 'timestamp': '2026-02-23 09:44:56.155998', '_unique_id': 'cd38012e6ec846738ecd402f9495cda9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.157 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21185ca3-c333-452f-aba3-0676b763c79d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.158259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f9d93ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '7c828b2b36fde706adf33b45f1ad6ea48a16da44c97c87c418552fefced35fcb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.158259', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f9da6a4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '65482ee50f59f07e88b2b6a0fc2030779ec71f7274bd011e381838304b6f879c'}]}, 'timestamp': '2026-02-23 09:44:56.187254', '_unique_id': '08a4e5c1e4cf4378aec3d8b316ef66e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.188 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.190 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c1d193d-c9d9-429c-a251-24f3d81baeb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.189660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4f9e1684-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '99de27b17fa88bab74b5b42cb53f560a0dc373dd88e58cc9fc5b4c38b10ab410'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.189660', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f9e26e2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '11a56b19fc4d15a83c4b0570169d3f389c74f19bd7d0a5442931eedac1aa0412'}]}, 'timestamp': '2026-02-23 09:44:56.190532', '_unique_id': '284b783755604eb4abfdbb419de336d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.192 12 DEBUG ceilometer.compute.pollsters [-] 
c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.193 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a63f45a-a00c-46ab-ba35-9e02af07d3a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.192702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'4f9e8d1c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '7a93d3822e5bd45cc7c96920b1e01be2c8216e748203e8f1d4807371aada9de8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.192702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4f9e9d34-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'b3f1bbb0ab748e841eea288b36576f523f09f6a2e75828a2a208d2b48d7e517c'}]}, 'timestamp': '2026-02-23 09:44:56.193560', '_unique_id': '498f8f38041e47968df8b744381277f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:44:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0198b8c6-ad21-4c95-8f7c-a7280ef24fa4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.195705', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4f9f030a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 
'message_signature': '7c00f5ccdcb3a3e33f73221aab28c600137c6ce32d5451fb13aee6560ac8a55d'}]}, 'timestamp': '2026-02-23 09:44:56.196198', '_unique_id': '64a17d6a76ed47bcb7d9e5a5df774704'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.197 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e6f7d46-3abe-4fbb-a09c-8148e2b2a98b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:44:56.198598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4fa21234-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.404869499, 'message_signature': 'a4c0699b1a6d3ab768c56849fe0ad0667edcb71d1e82737af91c4e261ca758d7'}]}, 'timestamp': '2026-02-23 09:44:56.216242', '_unique_id': '31d3d42863964401b40820a346bff034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.217 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8eff605-b157-48ac-a9ad-bb941c72addd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.218485', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa27b3e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': 'f052104f01dff4a6dd833077e53ade9ecf4c4302749d3f9c421070bf57cd9a62'}]}, 'timestamp': '2026-02-23 09:44:56.218971', '_unique_id': '7d80f1f4ab434eb0a8a9a61d97b2df37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '071cd756-3ad6-46e7-9d93-daa89f98a00e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.221062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa2dfac-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'b5f864e15c7fcf651dd7bbefb7be72077e76bd32d1a34722f5ca728491a6bb48'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.221062', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa2efa6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'b26cfa7273841f2695aa4995b1ae7ccc31c3ffaa9894a87190f2ec6d62a8f4f7'}]}, 'timestamp': '2026-02-23 09:44:56.221917', '_unique_id': '8ba661806dd74c4babafdc28500e588d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.222 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfeeaa2b-03bd-4776-acb9-b215b28cd252', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.224053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa3546e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': 'ea5453b591eb4da59e0f16edcbcb7b54f57eebc488235700ebe0f445d79c63fd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T09:44:56.224053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa3645e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': '619f70a5adb259d678c030a870f269e4c99ca602a769acf1e33801a0791d8184'}]}, 'timestamp': '2026-02-23 09:44:56.224901', '_unique_id': 'c61639a087b441f69baae7e0857f7aec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd0ccd1ff-eac6-4c36-83c1-283874889efe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.226986', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa3c75a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '3695ee5012151e6142cbd1efb81621dfebaa24a3faef6b1935d3f0475d8bb3ad'}]}, 'timestamp': '2026-02-23 09:44:56.227437', '_unique_id': '006f166cfb024fd69bcc1d3d338aebc5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.228 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baf26159-88af-4897-923e-20de5f972f55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.229522', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa42a60-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': 'af854dbb9a319ab7bd2fa3aaf0d0439eec8f3d0f178eb96b772fc9e819911aa8'}]}, 'timestamp': '2026-02-23 09:44:56.230003', '_unique_id': 'af2cafe9ad864e9e8f8b1307f6367468'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.230 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '088bdd9c-2a80-4795-8375-2cb1889161da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.232064', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa48d5c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '882661f88d1ea7ad5aeece9fe55a3416fc4528ad7f94d8e43a7ca4b0fb47b4fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.232064', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa49d24-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': 'f66ad6facf8cc35504be4777df2d4ec0ccbdf884d2f4e5aa2ce6e7b241a67620'}]}, 'timestamp': '2026-02-23 09:44:56.232913', '_unique_id': '32d4e88c4f1b44d7836272c2b89958e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.233 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 10260000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b841d89d-e494-4f06-861d-04bef9cf412d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10260000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:44:56.234995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4fa4ffe4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.404869499, 'message_signature': '01b14bd954b541621d36ed2d6c9b40c86808d69097f884f1abdbaf800cc1a824'}]}, 'timestamp': '2026-02-23 09:44:56.235423', '_unique_id': '9eaebfa4f36040a7b180162657fd79f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23
04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.236 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:44:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13806fb5-0dc7-4db7-a502-9620350eef91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.237479', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa56100-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 
'message_signature': '22c75f3f1338fdd512b4e954242bcd3e466f22d420a15c091a61e442459ca36d'}]}, 'timestamp': '2026-02-23 09:44:56.237952', '_unique_id': '29bf2857f9c04e94a31aad2c612e9147'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '603307f6-7801-462b-83ba-2828d6437478', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.239999', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa5c352-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': 'aaea731d43c1fc7b1b91a6accab32e176264c6b11461b56182baf4873d6e9343'}]}, 'timestamp': '2026-02-23 09:44:56.240443', '_unique_id': 'd579a034a6504cb88b079be6b3f9eb63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b87bb83c-2e4d-4299-a6c0-ab735509bfa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.242480', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa62450-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '338dcf86f374ea7de5109accf72dfa85bbf29768304adf31b4a5de4f7c9a5640'}]}, 'timestamp': '2026-02-23 09:44:56.242956', '_unique_id': 'f078e1a74846402dbece4a975be99fa2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.243 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ad0527-7c79-48e2-8bb1-d8f59fe1be87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.245257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa691a6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '168ab5211f2898a00d165cbdb97a8b6c20edc6a0c2817f84f1edb29ac2b35ce6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.245257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa6a394-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.347742767, 'message_signature': '23172a1295b73f18faf47a682f77db15d2193be86a01d112a61fe9eb05ca063a'}]}, 'timestamp': '2026-02-23 09:44:56.246171', '_unique_id': '22faf7c110844455b264b0fe1f60cb05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4786787-6780-4cb0-b626-defdbc977792', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.248441', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa70db6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '3b156e1a33d697ea648e8d79ea02ea96d887ec0180141353c60b97bceab80e2e'}]}, 'timestamp': '2026-02-23 09:44:56.248947', '_unique_id': '4fe362cd08534765bbca1c2479160ace'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e32d2a30-02d2-4ece-8c69-c216c5925185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:44:56.251487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4fa78638-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': '13ad19c7d864c3e05e181dbb032b62cb9e5e608358891752d5671ecd81b08a73'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:44:56.251487', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4fa79ab0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.332700866, 'message_signature': 'b26dae3c3dd782f1885c6cfdc6eac93d265da57c0d98fc9fc30aaadb92e86ed6'}]}, 'timestamp': '2026-02-23 09:44:56.252419', '_unique_id': 'ce5fede0dab448a1a099cf3a7e30f368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:44:56.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:44:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '59bd7020-e53a-4d01-ad23-2dbf0c542091', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:44:56.253736', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4fa7d99e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11336.324603247, 'message_signature': '0a0f23ec5c015bc153119034254a2052a3b117af88dc391a4bcbe660341c2016'}]}, 'timestamp': '2026-02-23 09:44:56.254033', '_unique_id': '40ba977ea52b437a8700ed614633a571'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:44:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:44:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:44:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:44:56 localhost ceph-mon[289530]: 
from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:56 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:56 localhost ceph-mon[289530]: Reconfiguring mon.np0005626460 (monmap changed)... Feb 23 04:44:56 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:44:56 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain Feb 23 04:44:56 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:57 localhost ceph-mon[289530]: mon.np0005626463@5(peon) e6 handle_command mon_command({"prefix": "mgr fail"} v 0) Feb 23 04:44:57 localhost ceph-mon[289530]: log_channel(audit) log [INF] : from='client.? 172.18.0.103:0/2046273284' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: mon.np0005626463@5(peon).osd e80 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Feb 23 04:44:57 localhost ceph-mon[289530]: mon.np0005626463@5(peon).osd e80 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Feb 23 04:44:57 localhost ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 e81: 6 total, 6 up, 6 in Feb 23 04:44:57 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:6800/920472675 Feb 23 04:44:57 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:6800/920472675 Feb 23 04:44:57 localhost systemd[1]: session-26.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd[1]: session-26.scope: Consumed 3min 34.340s CPU time. Feb 23 04:44:57 localhost systemd[1]: session-18.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 26 logged out. Waiting for processes to exit. 
Feb 23 04:44:57 localhost systemd[1]: session-21.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 18 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Session 21 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd[1]: session-20.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 20 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 26. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 18. Feb 23 04:44:57 localhost systemd[1]: session-16.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 16 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd[1]: session-24.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 24 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd[1]: session-17.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd[1]: session-22.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd[1]: session-14.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 14 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Session 17 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd[1]: session-23.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd-logind[759]: Session 22 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Session 23 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 21. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 20. Feb 23 04:44:57 localhost systemd[1]: session-25.scope: Deactivated successfully. Feb 23 04:44:57 localhost systemd[1]: session-19.scope: Deactivated successfully. 
Feb 23 04:44:57 localhost systemd-logind[759]: Session 25 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Session 19 logged out. Waiting for processes to exit. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 16. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 24. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 17. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 22. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 14. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 23. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 25. Feb 23 04:44:57 localhost systemd-logind[759]: Removed session 19. Feb 23 04:44:57 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Feb 23 04:44:57 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' Feb 23 04:44:57 localhost ceph-mon[289530]: from='mgr.14120 172.18.0.103:0/3790934138' entity='mgr.np0005626459.pmtxxl' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: from='client.? 172.18.0.103:0/2046273284' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: Activating manager daemon np0005626461.lrfquh Feb 23 04:44:57 localhost ceph-mon[289530]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:44:57 localhost ceph-mon[289530]: Manager daemon np0005626461.lrfquh is now available Feb 23 04:44:57 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/mirror_snapshot_schedule"} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/mirror_snapshot_schedule"} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/trash_purge_schedule"} : dispatch Feb 23 04:44:57 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626461.lrfquh/trash_purge_schedule"} : dispatch Feb 23 04:44:57 localhost sshd[290268]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:44:57 localhost systemd-logind[759]: New session 64 of user ceph-admin. Feb 23 04:44:57 localhost systemd[1]: Started Session 64 of User ceph-admin. 
Feb 23 04:44:58 localhost sshd[290330]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:44:58 localhost podman[290379]: 2026-02-23 09:44:58.577015373 +0000 UTC m=+0.087488793 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux ) Feb 23 04:44:58 localhost podman[290379]: 2026-02-23 09:44:58.67702081 +0000 UTC m=+0.187494220 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, release=1770267347) Feb 23 04:44:58 localhost nova_compute[282206]: 2026-02-23 09:44:58.956 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:58 localhost nova_compute[282206]: 2026-02-23 09:44:58.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:44:58 localhost nova_compute[282206]: 2026-02-23 09:44:58.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:44:58 localhost nova_compute[282206]: 2026-02-23 09:44:58.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:58 localhost nova_compute[282206]: 
2026-02-23 09:44:58.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:44:59 localhost nova_compute[282206]: 2026-02-23 09:44:58.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:44:59 localhost ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 _set_new_cache_sizes cache_size:1019728505 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:45:00 localhost ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Bus STARTING Feb 23 04:45:00 localhost ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Serving on https://172.18.0.105:7150 Feb 23 04:45:00 localhost ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Client ('172.18.0.105', 36526) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:45:00 localhost ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Serving on http://172.18.0.105:8765 Feb 23 04:45:00 localhost ceph-mon[289530]: [23/Feb/2026:09:44:59] ENGINE Bus STARTED Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost 
ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:00 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626459", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626459", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: 
from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config 
rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:45:02 localhost ceph-mon[289530]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:45:02 localhost ceph-mon[289530]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:45:02 localhost ceph-mon[289530]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:45:02 localhost ceph-mon[289530]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:45:02 localhost ceph-mon[289530]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:45:02 localhost ceph-mon[289530]: from='mgr.14193 
172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626459.localdomain:/etc/ceph/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:45:03 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:45:04 localhost nova_compute[282206]: 2026-02-23 09:45:04.044 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:04 localhost nova_compute[282206]: 2026-02-23 09:45:04.046 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:04 localhost nova_compute[282206]: 2026-02-23 09:45:04.046 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
idle 5048 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:04 localhost nova_compute[282206]: 2026-02-23 09:45:04.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:04 localhost nova_compute[282206]: 2026-02-23 09:45:04.049 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:04 localhost nova_compute[282206]: 2026-02-23 09:45:04.050 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:04 localhost ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 _set_new_cache_sizes cache_size:1020048086 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626459.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:45:04 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: 
Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: Updating np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' 
entity='mgr.np0005626461.lrfquh' Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:05 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:06 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)... Feb 23 04:45:06 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain Feb 23 04:45:06 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:06 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:06 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:45:06 localhost systemd[1]: tmp-crun.PTrCHb.mount: Deactivated successfully. 
Feb 23 04:45:06 localhost podman[291279]: 2026-02-23 09:45:06.925051829 +0000 UTC m=+0.098943736 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:45:06 localhost podman[291279]: 2026-02-23 09:45:06.936224652 +0000 UTC m=+0.110116559 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:45:06 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:45:07 localhost ceph-mon[289530]: Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:45:07 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:45:07 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:07 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:07 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:07 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:07 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:08 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... 
Feb 23 04:45:08 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:45:08 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:08 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:08 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:08 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:09 localhost nova_compute[282206]: 2026-02-23 09:45:09.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:09 localhost nova_compute[282206]: 2026-02-23 09:45:09.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:09 localhost nova_compute[282206]: 2026-02-23 09:45:09.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:09 localhost nova_compute[282206]: 2026-02-23 09:45:09.055 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:09 localhost nova_compute[282206]: 2026-02-23 09:45:09.091 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:09 localhost nova_compute[282206]: 2026-02-23 09:45:09.091 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:09 localhost podman[242954]: time="2026-02-23T09:45:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:45:09 localhost ceph-mon[289530]: mon.np0005626463@5(peon).osd e81 _set_new_cache_sizes cache_size:1020054592 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:45:09 localhost podman[242954]: @ - - [23/Feb/2026:09:45:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 23 04:45:09 localhost podman[242954]: @ - - [23/Feb/2026:09:45:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18260 "" "Go-http-client/1.1" Feb 23 04:45:09 localhost ceph-mon[289530]: Reconfiguring crash.np0005626461 (monmap changed)... Feb 23 04:45:09 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:45:09 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:09 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:09 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:09 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task 
ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:10 localhost podman[291356]: Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.077 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.078 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.078 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:45:10 localhost podman[291356]: 2026-02-23 09:45:10.083762499 +0000 UTC m=+0.082847141 container create 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on 
RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux ) Feb 23 04:45:10 localhost systemd[1]: Started libpod-conmon-46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b.scope. Feb 23 04:45:10 localhost systemd[1]: Started libcrun container. Feb 23 04:45:10 localhost podman[291356]: 2026-02-23 09:45:10.047503317 +0000 UTC m=+0.046588059 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:10 localhost podman[291356]: 2026-02-23 09:45:10.161189054 +0000 UTC m=+0.160273696 container init 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.42.2, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7) Feb 23 04:45:10 localhost podman[291356]: 2026-02-23 09:45:10.171031235 +0000 UTC m=+0.170115887 container start 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1770267347, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:45:10 localhost podman[291356]: 2026-02-23 09:45:10.171305684 +0000 UTC m=+0.170390326 container attach 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, version=7, 
CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True) Feb 23 04:45:10 localhost vibrant_allen[291371]: 167 167 Feb 23 04:45:10 localhost systemd[1]: libpod-46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b.scope: Deactivated successfully. 
Feb 23 04:45:10 localhost podman[291356]: 2026-02-23 09:45:10.176042189 +0000 UTC m=+0.175126881 container died 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=) Feb 23 04:45:10 localhost podman[291376]: 2026-02-23 09:45:10.276206591 +0000 UTC m=+0.088684451 container remove 46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_allen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, name=rhceph, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.42.2) Feb 23 04:45:10 localhost systemd[1]: libpod-conmon-46b07131f3f026008a26459a2d4fb8153d1d43b1ae1d9969925a8a067634e29b.scope: Deactivated successfully. Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.718 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.720 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 09:45:10.720 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:45:10 localhost nova_compute[282206]: 2026-02-23 
09:45:10.721 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:45:10 localhost ceph-mon[289530]: Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:45:10 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:45:10 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:10 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:10 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. 
Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.857786) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839910857922, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10990, "num_deletes": 525, "total_data_size": 15118282, "memory_usage": 15768304, "flush_reason": "Manual Compaction"} Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839910931901, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 10764182, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10995, "table_properties": {"data_size": 10711520, "index_size": 27340, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24261, "raw_key_size": 255183, "raw_average_key_size": 26, "raw_value_size": 10548182, "raw_average_value_size": 1087, "num_data_blocks": 1030, "num_entries": 9701, "num_filter_entries": 9701, "num_deletions": 524, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839879, "oldest_key_time": 1771839879, "file_creation_time": 1771839910, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3d1e4b58-ab15-4081-a9da-984e46fdc8b2", "db_session_id": "5E7LZX0BBD5RWYSEID7U", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 74249 microseconds, and 23974 cpu microseconds. Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.932042) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 10764182 bytes OK Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.932097) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.933971) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.934001) EVENT_LOG_v1 {"time_micros": 1771839910933992, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.934025) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 15044043, prev total WAL file size 15044043, number of live WAL files 2. 
Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.937474) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end) Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(2012B)] Feb 23 04:45:10 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839910937576, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 10766194, "oldest_snapshot_seqno": -1} Feb 23 04:45:11 localhost podman[291445]: Feb 23 04:45:11 localhost podman[291445]: 2026-02-23 09:45:11.031823035 +0000 UTC m=+0.061767366 container create 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, 
release=1770267347, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , RELEASE=main, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=) Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9181 keys, 10756894 bytes, temperature: kUnknown Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839911033698, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 10756894, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10705533, "index_size": 27324, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22981, "raw_key_size": 246612, "raw_average_key_size": 26, "raw_value_size": 10548837, "raw_average_value_size": 1148, "num_data_blocks": 1028, "num_entries": 9181, "num_filter_entries": 9181, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; 
use_zstd_dict_trainer=1; ", "creation_time": 1771839879, "oldest_key_time": 0, "file_creation_time": 1771839910, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3d1e4b58-ab15-4081-a9da-984e46fdc8b2", "db_session_id": "5E7LZX0BBD5RWYSEID7U", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:11.034070) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 10756894 bytes Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:11.036534) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 111.9 rd, 111.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.3, 0.0 +0.0 blob) out(10.3 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9706, records dropped: 525 output_compression: NoCompression Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:11.036564) EVENT_LOG_v1 {"time_micros": 1771839911036552, "job": 4, "event": "compaction_finished", "compaction_time_micros": 96238, "compaction_time_cpu_micros": 30422, "output_level": 6, "num_output_files": 1, "total_output_size": 10756894, "num_input_records": 9706, "num_output_records": 9181, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 
0.250000 Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839911038191, "job": 4, "event": "table_file_deletion", "file_number": 14} Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839911038299, "job": 4, "event": "table_file_deletion", "file_number": 8} Feb 23 04:45:11 localhost ceph-mon[289530]: rocksdb: (Original Log Time 2026/02/23-09:45:10.937339) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:45:11 localhost systemd[1]: Started libpod-conmon-23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9.scope. Feb 23 04:45:11 localhost systemd[1]: Started libcrun container. Feb 23 04:45:11 localhost podman[291445]: 2026-02-23 09:45:11.086390608 +0000 UTC m=+0.116334939 container init 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, 
description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph) Feb 23 04:45:11 localhost systemd[1]: var-lib-containers-storage-overlay-f41b365428462e5c4856888ece45a6a6a8335094acb2c9b1536847a1ebabc2b9-merged.mount: Deactivated successfully. Feb 23 04:45:11 localhost podman[291445]: 2026-02-23 09:45:10.996380097 +0000 UTC m=+0.026324528 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:11 localhost podman[291445]: 2026-02-23 09:45:11.09917051 +0000 UTC m=+0.129114881 container start 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, 
io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Feb 23 04:45:11 localhost podman[291445]: 2026-02-23 09:45:11.099481379 +0000 UTC m=+0.129425740 container attach 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=) Feb 23 04:45:11 localhost intelligent_nightingale[291461]: 167 167 Feb 23 04:45:11 localhost systemd[1]: libpod-23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9.scope: Deactivated successfully. 
Feb 23 04:45:11 localhost podman[291445]: 2026-02-23 09:45:11.102147061 +0000 UTC m=+0.132091472 container died 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, ceph=True, build-date=2026-02-09T10:25:24Z) Feb 23 04:45:11 localhost nova_compute[282206]: 2026-02-23 09:45:11.173 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:45:11 localhost systemd[1]: var-lib-containers-storage-overlay-621224875a7cb1321f8e4b66ef64452fcc1770bba9939697bb6eb427bba3927f-merged.mount: Deactivated successfully. Feb 23 04:45:11 localhost ceph-mon[289530]: mon.np0005626463@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:45:11 localhost ceph-mon[289530]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1266158925' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:45:11 localhost nova_compute[282206]: 2026-02-23 09:45:11.188 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:45:11 localhost nova_compute[282206]: 2026-02-23 09:45:11.188 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:45:11 localhost podman[291466]: 2026-02-23 09:45:11.19536536 +0000 UTC m=+0.084351888 container remove 23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_nightingale, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.buildah.version=1.42.2, GIT_BRANCH=main, ceph=True, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vendor=Red Hat, Inc.) Feb 23 04:45:11 localhost systemd[1]: libpod-conmon-23d0be7b244400625106b2a6dbba6bc439ee78f6ead2263061c28fb191ed25d9.scope: Deactivated successfully. Feb 23 04:45:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:45:11 localhost podman[291507]: 2026-02-23 09:45:11.593043186 +0000 UTC m=+0.089096214 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.7, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 23 04:45:11 localhost podman[291507]: 2026-02-23 09:45:11.612281556 +0000 UTC m=+0.108334604 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc.) Feb 23 04:45:11 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:45:11 localhost ceph-mon[289530]: Reconfiguring osd.2 (monmap changed)... 
Feb 23 04:45:11 localhost ceph-mon[289530]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:45:11 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:11 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:11 localhost ceph-mon[289530]: Reconfiguring osd.5 (monmap changed)... Feb 23 04:45:11 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:45:11 localhost ceph-mon[289530]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:45:11 localhost podman[291560]: Feb 23 04:45:12 localhost podman[291560]: 2026-02-23 09:45:12.001644367 +0000 UTC m=+0.069899625 container create 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , ceph=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, name=rhceph, 
GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7) Feb 23 04:45:12 localhost systemd[1]: Started libpod-conmon-171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350.scope. Feb 23 04:45:12 localhost nova_compute[282206]: 2026-02-23 09:45:12.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:12 localhost systemd[1]: Started libcrun container. Feb 23 04:45:12 localhost podman[291560]: 2026-02-23 09:45:11.968698986 +0000 UTC m=+0.036954284 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:12 localhost podman[291560]: 2026-02-23 09:45:12.076021887 +0000 UTC m=+0.144277145 container init 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main) Feb 23 04:45:12 localhost podman[291560]: 2026-02-23 09:45:12.083908189 +0000 UTC m=+0.152163437 container start 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:45:12 localhost podman[291560]: 2026-02-23 09:45:12.084141546 +0000 UTC m=+0.152396804 container attach 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, 
distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vcs-type=git, release=1770267347, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=) Feb 23 04:45:12 localhost reverent_mcclintock[291575]: 167 167 Feb 23 04:45:12 localhost systemd[1]: libpod-171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350.scope: Deactivated successfully. 
Feb 23 04:45:12 localhost podman[291560]: 2026-02-23 09:45:12.088581173 +0000 UTC m=+0.156836471 container died 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:45:12 localhost systemd[1]: var-lib-containers-storage-overlay-1d79617f1932691bbe7c3efa34c0e333cb446b5094a5769143e962411d5d39a6-merged.mount: Deactivated successfully. 
Feb 23 04:45:12 localhost podman[291580]: 2026-02-23 09:45:12.194723778 +0000 UTC m=+0.092588130 container remove 171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_mcclintock, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.42.2, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:45:12 localhost systemd[1]: libpod-conmon-171e99f84cf07980589966b36cb6c9cc81b646d5d350d8090c92acf998e21350.scope: Deactivated successfully. 
Feb 23 04:45:12 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e51e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0 Feb 23 04:45:12 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0 Feb 23 04:45:12 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0 Feb 23 04:45:12 localhost ceph-mon[289530]: mon.np0005626463@5(peon) e7 my rank is now 4 (was 5) Feb 23 04:45:12 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0 Feb 23 04:45:12 localhost ceph-mon[289530]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:45:12 localhost ceph-mon[289530]: paxos.4).electionLogic(26) init, last seen epoch 26 Feb 23 04:45:12 localhost ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:12 localhost ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:13 localhost podman[291656]: Feb 23 04:45:13 localhost podman[291656]: 2026-02-23 09:45:13.025171896 +0000 UTC m=+0.073232137 container create 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , 
architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:45:13 localhost ceph-mds[286877]: --2- [v2:172.18.0.106:6808/2515508693,v1:172.18.0.106:6809/2515508693] >> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] conn(0x5634b1281c00 0x5634b1423600 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Feb 23 04:45:13 localhost nova_compute[282206]: 2026-02-23 09:45:13.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:13 localhost nova_compute[282206]: 2026-02-23 09:45:13.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:13 localhost systemd[1]: Started libpod-conmon-5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d.scope. Feb 23 04:45:13 localhost systemd[1]: Started libcrun container. 
Feb 23 04:45:13 localhost podman[291656]: 2026-02-23 09:45:13.091060267 +0000 UTC m=+0.139120518 container init 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, release=1770267347, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True) Feb 23 04:45:13 localhost podman[291656]: 2026-02-23 09:45:12.997184087 +0000 UTC m=+0.045244388 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:13 localhost systemd[1]: tmp-crun.cWfQTV.mount: Deactivated successfully. 
Feb 23 04:45:13 localhost podman[291656]: 2026-02-23 09:45:13.104191049 +0000 UTC m=+0.152251260 container start 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347) Feb 23 04:45:13 localhost podman[291656]: 2026-02-23 09:45:13.104492029 +0000 UTC m=+0.152552310 container attach 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, vendor=Red Hat, Inc., name=rhceph, release=1770267347, RELEASE=main, version=7, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:45:13 localhost bold_agnesi[291672]: 167 167 Feb 23 04:45:13 localhost systemd[1]: libpod-5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d.scope: Deactivated successfully. 
Feb 23 04:45:13 localhost podman[291656]: 2026-02-23 09:45:13.107988496 +0000 UTC m=+0.156048797 container died 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, release=1770267347, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, distribution-scope=public) Feb 23 04:45:13 localhost systemd[1]: var-lib-containers-storage-overlay-e7744f9b1f0b5c7c8419c1aa6a71833428ba30aa2f214a925005d33bf6b896e4-merged.mount: Deactivated successfully. 
Feb 23 04:45:13 localhost podman[291677]: 2026-02-23 09:45:13.19484957 +0000 UTC m=+0.075041353 container remove 5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_agnesi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, RELEASE=main, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:45:13 localhost systemd[1]: libpod-conmon-5a151d06aa8018544854c43808ee3445256ba5cdc4844925c19f07aa8e54648d.scope: Deactivated successfully. 
Feb 23 04:45:13 localhost openstack_network_exporter[245358]: ERROR 09:45:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:45:13 localhost openstack_network_exporter[245358]: Feb 23 04:45:13 localhost openstack_network_exporter[245358]: ERROR 09:45:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:45:13 localhost openstack_network_exporter[245358]: Feb 23 04:45:14 localhost nova_compute[282206]: 2026-02-23 09:45:14.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:14 localhost nova_compute[282206]: 2026-02-23 09:45:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:14 localhost nova_compute[282206]: 2026-02-23 09:45:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:14 localhost nova_compute[282206]: 2026-02-23 09:45:14.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:45:14 localhost nova_compute[282206]: 2026-02-23 09:45:14.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:14 localhost nova_compute[282206]: 2026-02-23 09:45:14.093 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:15 localhost nova_compute[282206]: 2026-02-23 09:45:15.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:15 localhost nova_compute[282206]: 2026-02-23 09:45:15.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:45:15 localhost nova_compute[282206]: 2026-02-23 09:45:15.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:45:15 localhost nova_compute[282206]: 2026-02-23 09:45:15.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:45:15 localhost 
nova_compute[282206]: 2026-02-23 09:45:15.075 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:45:15 localhost nova_compute[282206]: 2026-02-23 09:45:15.076 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626463@4(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626466 calling monitor election Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626461 calling monitor election Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626460 calling monitor election Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626463 calling monitor election Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626463 in quorum (ranks 0,1,2,4) Feb 23 04:45:17 localhost ceph-mon[289530]: Health check failed: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 (MON_DOWN) Feb 23 04:45:17 localhost ceph-mon[289530]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 Feb 23 04:45:17 localhost ceph-mon[289530]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 Feb 23 04:45:17 localhost ceph-mon[289530]: mon.np0005626465 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Feb 23 04:45:18 localhost 
nova_compute[282206]: 2026-02-23 09:45:18.505 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.584 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.585 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.787 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.789 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11862MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.789 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.790 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.871 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.872 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.872 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626463@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626463@4(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:45:18 localhost nova_compute[282206]: 2026-02-23 09:45:18.919 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:45:18 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' 
entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:45:18 localhost ceph-mon[289530]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:45:18 localhost ceph-mon[289530]: Remove daemons mon.np0005626459 Feb 23 04:45:18 localhost ceph-mon[289530]: Safe to remove mon.np0005626459: new quorum should be ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463'] (from ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463']) Feb 23 04:45:18 localhost ceph-mon[289530]: Removing monitor np0005626459 from monmap... Feb 23 04:45:18 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon rm", "name": "np0005626459"} : dispatch Feb 23 04:45:18 localhost ceph-mon[289530]: Removing daemon mon.np0005626459 from np0005626459.localdomain -- ports [] Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626465 calling monitor election Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626460 calling monitor election Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626461 calling monitor election Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626466 calling monitor election Feb 23 04:45:18 localhost ceph-mon[289530]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4) Feb 23 04:45:18 localhost ceph-mon[289530]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463) Feb 23 04:45:18 localhost ceph-mon[289530]: Cluster is now healthy Feb 23 04:45:18 localhost ceph-mon[289530]: overall HEALTH_OK Feb 23 04:45:18 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:18 localhost ceph-mon[289530]: 
from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:18 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:18 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.094 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.097 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.097 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.097 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:19 localhost 
nova_compute[282206]: 2026-02-23 09:45:19.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:19 localhost systemd[1]: tmp-crun.dQZbS7.mount: Deactivated successfully. Feb 23 04:45:19 localhost podman[291755]: 2026-02-23 09:45:19.189012678 +0000 UTC m=+0.097525362 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:45:19 localhost podman[291754]: 2026-02-23 09:45:19.238969229 +0000 UTC m=+0.147384180 container health_status 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:45:19 localhost podman[291754]: 2026-02-23 09:45:19.271087575 +0000 UTC m=+0.179502526 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2) Feb 23 04:45:19 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:45:19 localhost podman[291755]: 2026-02-23 09:45:19.323313756 +0000 UTC m=+0.231826470 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:45:19 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.382 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.388 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:45:19 localhost ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.409 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:45:19 localhost nova_compute[282206]: 2026-02-23 09:45:19.412 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:45:19 
localhost nova_compute[282206]: 2026-02-23 09:45:19.412 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:45:19 localhost podman[291840]: Feb 23 04:45:19 localhost podman[291840]: 2026-02-23 09:45:19.614492886 +0000 UTC m=+0.075939180 container create ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.42.2) Feb 23 04:45:19 localhost systemd[1]: Started libpod-conmon-ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59.scope. Feb 23 04:45:19 localhost systemd[1]: Started libcrun container. 
Feb 23 04:45:19 localhost podman[291840]: 2026-02-23 09:45:19.583490585 +0000 UTC m=+0.044936929 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:19 localhost podman[291840]: 2026-02-23 09:45:19.684246995 +0000 UTC m=+0.145693289 container init ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1770267347, build-date=2026-02-09T10:25:24Z, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 04:45:19 localhost podman[291840]: 2026-02-23 09:45:19.694005725 +0000 UTC m=+0.155452029 container start ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z) Feb 23 04:45:19 localhost podman[291840]: 2026-02-23 09:45:19.694336115 +0000 UTC m=+0.155782409 container attach ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, release=1770267347, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:45:19 localhost inspiring_euler[291855]: 167 167 Feb 23 04:45:19 localhost systemd[1]: libpod-ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59.scope: Deactivated successfully. 
Feb 23 04:45:19 localhost podman[291840]: 2026-02-23 09:45:19.698116121 +0000 UTC m=+0.159562445 container died ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, distribution-scope=public, release=1770267347) Feb 23 04:45:19 localhost podman[291860]: 2026-02-23 09:45:19.792711432 +0000 UTC m=+0.082932115 container remove ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_euler, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, version=7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:45:19 localhost systemd[1]: libpod-conmon-ece168ae9d19c0d16a7c7aa48ef68dc0c70ce57c2272934c9a449bd13f93ba59.scope: Deactivated successfully. Feb 23 04:45:19 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)... Feb 23 04:45:19 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:45:19 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:19 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:19 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:45:20 localhost systemd[1]: var-lib-containers-storage-overlay-687799a5ff8a64bd19365c3174285223d5ee6279077a6b09507ea30c065cd6a7-merged.mount: Deactivated successfully. 
Feb 23 04:45:20 localhost nova_compute[282206]: 2026-02-23 09:45:20.412 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:45:20 localhost podman[291930]: Feb 23 04:45:20 localhost podman[291930]: 2026-02-23 09:45:20.507861514 +0000 UTC m=+0.074755103 container create bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:45:20 localhost systemd[1]: Started libpod-conmon-bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f.scope. Feb 23 04:45:20 localhost systemd[1]: Started libcrun container. 
Feb 23 04:45:20 localhost podman[291930]: 2026-02-23 09:45:20.57032527 +0000 UTC m=+0.137218859 container init bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public) Feb 23 04:45:20 localhost podman[291930]: 2026-02-23 09:45:20.476980277 +0000 UTC m=+0.043873906 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:20 localhost podman[291930]: 2026-02-23 09:45:20.581760131 +0000 UTC m=+0.148653720 container start bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, io.buildah.version=1.42.2, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, 
RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:45:20 localhost podman[291930]: 2026-02-23 09:45:20.582186494 +0000 UTC m=+0.149080083 container attach bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, 
build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container) Feb 23 04:45:20 localhost naughty_borg[291945]: 167 167 Feb 23 04:45:20 localhost systemd[1]: libpod-bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f.scope: Deactivated successfully. Feb 23 04:45:20 localhost podman[291930]: 2026-02-23 09:45:20.586460644 +0000 UTC m=+0.153354303 container died bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.buildah.version=1.42.2, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, release=1770267347, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:45:20 localhost podman[291950]: 
2026-02-23 09:45:20.677204198 +0000 UTC m=+0.077079656 container remove bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_borg, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, io.openshift.expose-services=) Feb 23 04:45:20 localhost systemd[1]: libpod-conmon-bc62ecad100ea333a4c432185ae93a0a17d75811a22bf97cb7aa26b26a03a47f.scope: Deactivated successfully. Feb 23 04:45:21 localhost systemd[1]: var-lib-containers-storage-overlay-b20f6591a2a87ff9dfdddb000e9d30f7310c535f9d9755037bdd6534fcf26088-merged.mount: Deactivated successfully. Feb 23 04:45:21 localhost ceph-mon[289530]: Reconfiguring mon.np0005626463 (monmap changed)... 
Feb 23 04:45:21 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 04:45:21 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:21 localhost ceph-mon[289530]: Removed label mon from host np0005626459.localdomain
Feb 23 04:45:21 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:21 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:21 localhost ceph-mon[289530]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:45:21 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:45:21 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:45:21 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:22 localhost ceph-mon[289530]: Reconfiguring osd.0 (monmap changed)...
Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 23 04:45:22 localhost ceph-mon[289530]: Reconfiguring daemon osd.0 on np0005626465.localdomain Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:22 localhost ceph-mon[289530]: Removed label mgr from host np0005626459.localdomain Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:22 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 23 04:45:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:45:22 localhost podman[291966]: 2026-02-23 09:45:22.922395423 +0000 UTC m=+0.092600821 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 
'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute) Feb 23 04:45:22 localhost podman[291966]: 2026-02-23 09:45:22.938427675 +0000 UTC m=+0.108633093 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 23 04:45:22 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:45:23 localhost ceph-mon[289530]: Reconfiguring osd.3 (monmap changed)... 
Feb 23 04:45:23 localhost ceph-mon[289530]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:45:23 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:23 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:23 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:23 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:45:23 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:45:24 localhost nova_compute[282206]: 2026-02-23 09:45:24.179 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:45:24 localhost nova_compute[282206]: 2026-02-23 09:45:24.182 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:45:24 localhost ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:45:24 localhost ceph-mon[289530]: Removed label _admin from host np0005626459.localdomain
Feb 23 04:45:24 localhost ceph-mon[289530]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:45:24 localhost ceph-mon[289530]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:45:24 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:24 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:24 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:45:24 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:45:25 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:45:25 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 04:45:25 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:25 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:25 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:45:26 localhost ceph-mon[289530]: Reconfiguring mon.np0005626465 (monmap changed)...
Feb 23 04:45:26 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:45:26 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:26 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:26 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:26 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:45:26 localhost podman[291985]: 2026-02-23 09:45:26.918826196 +0000 UTC m=+0.087410702 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:45:26 localhost podman[291985]: 2026-02-23 09:45:26.954391566 +0000 UTC m=+0.122976062 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 04:45:26 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:45:27 localhost ceph-mon[289530]: Reconfiguring crash.np0005626466 (monmap changed)... Feb 23 04:45:27 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:45:27 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:27 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:27 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:45:28 localhost sshd[292001]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:45:28 localhost ceph-mon[289530]: Reconfiguring osd.1 (monmap changed)... 
Feb 23 04:45:28 localhost ceph-mon[289530]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:45:28 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:28 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:28 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 04:45:29 localhost nova_compute[282206]: 2026-02-23 09:45:29.184 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:45:29 localhost nova_compute[282206]: 2026-02-23 09:45:29.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:45:29 localhost nova_compute[282206]: 2026-02-23 09:45:29.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:45:29 localhost nova_compute[282206]: 2026-02-23 09:45:29.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:45:29 localhost nova_compute[282206]: 2026-02-23 09:45:29.206 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:45:29 localhost nova_compute[282206]: 2026-02-23 09:45:29.207 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:45:29 localhost ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:45:29 localhost ceph-mon[289530]: Reconfiguring osd.4 (monmap changed)...
Feb 23 04:45:29 localhost ceph-mon[289530]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:45:29 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:29 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:29 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:45:29 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:45:30 localhost ceph-mon[289530]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 04:45:30 localhost ceph-mon[289530]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:30 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:45:30 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:30 localhost ceph-mon[289530]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 04:45:30 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:45:30 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:45:32 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:32 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:33 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:33 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:33 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:45:33 localhost ceph-mon[289530]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:45:33 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 04:45:33 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 04:45:33 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 04:45:33 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 04:45:33 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 04:45:33 localhost ceph-mon[289530]: Removing np0005626459.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:45:33 localhost ceph-mon[289530]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:45:33 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:33 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:34 localhost ceph-mon[289530]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:45:34 localhost ceph-mon[289530]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:45:34 localhost ceph-mon[289530]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:45:34 localhost ceph-mon[289530]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:45:34 localhost ceph-mon[289530]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:45:34 localhost nova_compute[282206]: 2026-02-23 09:45:34.208 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:45:34 localhost nova_compute[282206]: 2026-02-23 09:45:34.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:45:34 localhost nova_compute[282206]: 2026-02-23 09:45:34.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:45:34 localhost nova_compute[282206]: 2026-02-23 09:45:34.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:45:34 localhost nova_compute[282206]: 2026-02-23 09:45:34.235 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:45:34 localhost nova_compute[282206]: 2026-02-23 09:45:34.236 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:45:34 localhost ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: Added label _no_schedule to host np0005626459.localdomain
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626459.localdomain
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:35 localhost ceph-mon[289530]: Removing daemon crash.np0005626459 from np0005626459.localdomain -- ports []
Feb 23 04:45:36 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch
Feb 23 04:45:36 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch
Feb 23 04:45:36 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005626459"}]': finished
Feb 23 04:45:36 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:45:37 localhost ceph-mon[289530]: Removing key for client.crash.np0005626459 Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:37 localhost ceph-mon[289530]: Removing daemon mgr.np0005626459.pmtxxl from np0005626459.localdomain -- ports [9283, 8765] Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"}]': finished Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"}]': finished Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:37 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:37 localhost podman[292340]: 2026-02-23 09:45:37.820985361 +0000 UTC m=+0.092214518 container health_status 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:45:37 localhost podman[292340]: 2026-02-23 09:45:37.835657612 +0000 UTC m=+0.106886769 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, 
maintainer=Navid Yaghoobi ) Feb 23 04:45:37 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:45:38 localhost ceph-mon[289530]: Removed host np0005626459.localdomain Feb 23 04:45:38 localhost ceph-mon[289530]: Removing key for mgr.np0005626459.pmtxxl Feb 23 04:45:38 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:45:38 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:38 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:38 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:39 localhost nova_compute[282206]: 2026-02-23 09:45:39.237 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:39 localhost nova_compute[282206]: 2026-02-23 09:45:39.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:39 localhost nova_compute[282206]: 2026-02-23 09:45:39.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:39 localhost nova_compute[282206]: 2026-02-23 09:45:39.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:39 localhost nova_compute[282206]: 2026-02-23 09:45:39.261 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:39 localhost nova_compute[282206]: 2026-02-23 09:45:39.261 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:39 localhost sshd[292381]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:45:39 localhost podman[242954]: time="2026-02-23T09:45:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:45:39 localhost ceph-mon[289530]: mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:45:39 localhost podman[242954]: @ - - [23/Feb/2026:09:45:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1" Feb 23 04:45:39 localhost podman[242954]: @ - - [23/Feb/2026:09:45:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18267 "" "Go-http-client/1.1" Feb 23 04:45:39 localhost systemd[1]: Created slice User Slice of UID 1003. Feb 23 04:45:39 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Feb 23 04:45:39 localhost systemd-logind[759]: New session 65 of user tripleo-admin. Feb 23 04:45:39 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Feb 23 04:45:39 localhost systemd[1]: Starting User Manager for UID 1003... Feb 23 04:45:39 localhost systemd[292385]: Queued start job for default target Main User Target. Feb 23 04:45:39 localhost systemd[292385]: Created slice User Application Slice. 
Feb 23 04:45:39 localhost systemd[292385]: Started Mark boot as successful after the user session has run 2 minutes. Feb 23 04:45:39 localhost systemd[292385]: Started Daily Cleanup of User's Temporary Directories. Feb 23 04:45:39 localhost systemd[292385]: Reached target Paths. Feb 23 04:45:39 localhost systemd[292385]: Reached target Timers. Feb 23 04:45:39 localhost systemd[292385]: Starting D-Bus User Message Bus Socket... Feb 23 04:45:39 localhost systemd[292385]: Starting Create User's Volatile Files and Directories... Feb 23 04:45:39 localhost systemd[292385]: Finished Create User's Volatile Files and Directories. Feb 23 04:45:39 localhost systemd[292385]: Listening on D-Bus User Message Bus Socket. Feb 23 04:45:39 localhost systemd[292385]: Reached target Sockets. Feb 23 04:45:39 localhost systemd[292385]: Reached target Basic System. Feb 23 04:45:39 localhost systemd[292385]: Reached target Main User Target. Feb 23 04:45:39 localhost systemd[292385]: Startup finished in 152ms. Feb 23 04:45:39 localhost systemd[1]: Started User Manager for UID 1003. Feb 23 04:45:39 localhost systemd[1]: Started Session 65 of User tripleo-admin. Feb 23 04:45:39 localhost ceph-mon[289530]: Reconfiguring crash.np0005626460 (monmap changed)... 
Feb 23 04:45:39 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain Feb 23 04:45:39 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:39 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:39 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:45:40 localhost python3[292527]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Feb 23 04:45:40 localhost ceph-mon[289530]: Reconfiguring mon.np0005626460 (monmap changed)... 
Feb 23 04:45:40 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain Feb 23 04:45:40 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:40 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:40 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:40 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:41 localhost python3[292673]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:45:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:45:41 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)... Feb 23 04:45:41 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain Feb 23 04:45:41 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:41 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:41 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:45:41 localhost systemd[1]: tmp-crun.aL7tzr.mount: Deactivated successfully. 
Feb 23 04:45:41 localhost podman[292818]: 2026-02-23 09:45:41.872082161 +0000 UTC m=+0.098835752 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., release=1770267347) Feb 23 04:45:41 localhost podman[292818]: 2026-02-23 09:45:41.886006659 +0000 UTC m=+0.112760220 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, io.openshift.tags=minimal rhel9, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:45:41 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:45:41 localhost python3[292819]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 04:45:42 localhost ceph-mon[289530]: Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:45:42 localhost ceph-mon[289530]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:45:42 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:42 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:42 localhost ceph-mon[289530]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... 
Feb 23 04:45:42 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:42 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:45:42 localhost ceph-mon[289530]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:45:42 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:43 localhost openstack_network_exporter[245358]: ERROR 09:45:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:45:43 localhost openstack_network_exporter[245358]: Feb 23 04:45:43 localhost openstack_network_exporter[245358]: ERROR 09:45:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:45:43 localhost openstack_network_exporter[245358]: Feb 23 04:45:44 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:44 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:44 localhost ceph-mon[289530]: Reconfiguring crash.np0005626461 (monmap changed)... 
Feb 23 04:45:44 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:44 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:44 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:45:44 localhost nova_compute[282206]: 2026-02-23 09:45:44.262 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:44 localhost nova_compute[282206]: 2026-02-23 09:45:44.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:44 localhost nova_compute[282206]: 2026-02-23 09:45:44.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:44 localhost nova_compute[282206]: 2026-02-23 09:45:44.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:44 localhost nova_compute[282206]: 2026-02-23 09:45:44.302 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:44 localhost nova_compute[282206]: 2026-02-23 09:45:44.303 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:44 localhost ceph-mon[289530]: 
mon.np0005626463@4(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:45:44 localhost podman[292914]: Feb 23 04:45:44 localhost podman[292914]: 2026-02-23 09:45:44.847499499 +0000 UTC m=+0.065518351 container create 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, distribution-scope=public, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, release=1770267347) Feb 23 04:45:44 localhost systemd[1]: Started libpod-conmon-416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7.scope. Feb 23 04:45:44 localhost systemd[1]: Started libcrun container. 
Feb 23 04:45:44 localhost podman[292914]: 2026-02-23 09:45:44.815839847 +0000 UTC m=+0.033858719 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:44 localhost podman[292914]: 2026-02-23 09:45:44.924989174 +0000 UTC m=+0.143008036 container init 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, version=7, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Feb 23 04:45:44 localhost peaceful_carson[292929]: 167 167 Feb 23 04:45:44 localhost systemd[1]: libpod-416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7.scope: Deactivated successfully. 
Feb 23 04:45:44 localhost podman[292914]: 2026-02-23 09:45:44.935696463 +0000 UTC m=+0.153715315 container start 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, version=7) Feb 23 04:45:44 localhost podman[292914]: 2026-02-23 09:45:44.936049484 +0000 UTC m=+0.154068336 container attach 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True) Feb 23 04:45:44 localhost podman[292914]: 2026-02-23 09:45:44.937746346 +0000 UTC m=+0.155765248 container died 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, release=1770267347, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc.) Feb 23 04:45:45 localhost podman[292934]: 2026-02-23 09:45:45.031628905 +0000 UTC m=+0.083277674 container remove 416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=peaceful_carson, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_BRANCH=main, release=1770267347, CEPH_POINT_RELEASE=) Feb 23 04:45:45 localhost systemd[1]: libpod-conmon-416e2f1045834857bdfc425f4c31d00781e42ec02862a5befac76fb105b708a7.scope: Deactivated successfully. 
Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:45 localhost ceph-mon[289530]: Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:45:45 localhost ceph-mon[289530]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:45:45 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:45 localhost podman[293002]: Feb 23 04:45:45 localhost podman[293002]: 2026-02-23 09:45:45.72862664 +0000 UTC m=+0.074382251 container create 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, 
description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc.) Feb 23 04:45:45 localhost systemd[1]: Started libpod-conmon-70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9.scope. Feb 23 04:45:45 localhost systemd[1]: Started libcrun container. 
Feb 23 04:45:45 localhost podman[293002]: 2026-02-23 09:45:45.787996081 +0000 UTC m=+0.133751692 container init 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:45:45 localhost dreamy_davinci[293017]: 167 167 Feb 23 04:45:45 localhost podman[293002]: 2026-02-23 09:45:45.799135173 +0000 UTC m=+0.144890774 container start 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, 
architecture=x86_64, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:45:45 localhost podman[293002]: 2026-02-23 09:45:45.700307542 +0000 UTC m=+0.046063193 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:45 localhost podman[293002]: 2026-02-23 09:45:45.799628759 +0000 UTC m=+0.145384370 container attach 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, RELEASE=main, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:45:45 localhost systemd[1]: libpod-70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9.scope: Deactivated successfully. Feb 23 04:45:45 localhost podman[293002]: 2026-02-23 09:45:45.801724542 +0000 UTC m=+0.147480213 container died 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2) Feb 23 04:45:45 
localhost systemd[1]: var-lib-containers-storage-overlay-e05ef13dc817c465026bee3c6ea3b4ca4614671ade21d6fa8fa9378a61130c4f-merged.mount: Deactivated successfully. Feb 23 04:45:45 localhost systemd[1]: tmp-crun.OwJLBF.mount: Deactivated successfully. Feb 23 04:45:45 localhost systemd[1]: var-lib-containers-storage-overlay-c31e04b19edf851b7e7f63f783c207fb50d96460f5ab5c884bc0710e9594cc8d-merged.mount: Deactivated successfully. Feb 23 04:45:45 localhost podman[293022]: 2026-02-23 09:45:45.893106085 +0000 UTC m=+0.079532490 container remove 70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dreamy_davinci, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux , version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_BRANCH=main, RELEASE=main, release=1770267347, vendor=Red Hat, Inc.) Feb 23 04:45:45 localhost systemd[1]: libpod-conmon-70fd54a3f9e42f5f675090e6d59da929a66410e46db8f3233899c631e4e7c5d9.scope: Deactivated successfully. 
Feb 23 04:45:46 localhost podman[293098]: Feb 23 04:45:46 localhost podman[293098]: 2026-02-23 09:45:46.696560775 +0000 UTC m=+0.057192675 container create c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.42.2, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, version=7) Feb 23 04:45:46 localhost systemd[1]: Started libpod-conmon-c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525.scope. Feb 23 04:45:46 localhost systemd[1]: Started libcrun container. Feb 23 04:45:46 localhost ceph-mon[289530]: Saving service mon spec with placement label:mon Feb 23 04:45:46 localhost ceph-mon[289530]: Reconfiguring osd.2 (monmap changed)... 
Feb 23 04:45:46 localhost ceph-mon[289530]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:45:46 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:46 localhost ceph-mon[289530]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:45:46 localhost ceph-mon[289530]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:45:46 localhost podman[293098]: 2026-02-23 09:45:46.754098589 +0000 UTC m=+0.114730499 container init c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:45:46 localhost podman[293098]: 2026-02-23 09:45:46.762740344 +0000 UTC m=+0.123372244 container start 
c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z) Feb 23 04:45:46 localhost podman[293098]: 2026-02-23 09:45:46.763000412 +0000 UTC m=+0.123632312 container attach c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, version=7, maintainer=Guillaume Abrioux , release=1770267347, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, 
io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:45:46 localhost lucid_mclaren[293113]: 167 167 Feb 23 04:45:46 localhost systemd[1]: libpod-c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525.scope: Deactivated successfully. Feb 23 04:45:46 localhost podman[293098]: 2026-02-23 09:45:46.76553675 +0000 UTC m=+0.126168680 container died c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, release=1770267347, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:45:46 localhost podman[293098]: 2026-02-23 09:45:46.667740191 +0000 UTC m=+0.028372131 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:46 localhost sshd[293129]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:45:46 localhost systemd[1]: var-lib-containers-storage-overlay-3639d597589c16367452ce06da7a6a717d37521643186a666de3899ac3abb8d9-merged.mount: Deactivated successfully. Feb 23 04:45:46 localhost podman[293118]: 2026-02-23 09:45:46.864068993 +0000 UTC m=+0.086546636 container remove c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_mclaren, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, release=1770267347) Feb 23 04:45:46 localhost systemd[1]: libpod-conmon-c6fc7bd83c90b0e6ff2474893ebc0a5635d40c92774b06e0c3f389334c4b3525.scope: Deactivated successfully. Feb 23 04:45:47 localhost podman[293194]: Feb 23 04:45:47 localhost podman[293194]: 2026-02-23 09:45:47.716558636 +0000 UTC m=+0.072778663 container create 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2) Feb 23 04:45:47 localhost systemd[1]: Started libpod-conmon-421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73.scope. 
Feb 23 04:45:47 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e4f20 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0 Feb 23 04:45:47 localhost ceph-mon[289530]: mon.np0005626463@4(peon) e8 removed from monmap, suicide. Feb 23 04:45:47 localhost podman[293194]: 2026-02-23 09:45:47.686494764 +0000 UTC m=+0.042714831 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:47 localhost systemd[1]: Started libcrun container. Feb 23 04:45:47 localhost podman[293194]: 2026-02-23 09:45:47.799895952 +0000 UTC m=+0.156115979 container init 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2) Feb 23 04:45:47 localhost podman[293194]: 2026-02-23 09:45:47.808834766 +0000 UTC m=+0.165054793 container start 
421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, architecture=x86_64, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, RELEASE=main, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7) Feb 23 04:45:47 localhost podman[293194]: 2026-02-23 09:45:47.809104324 +0000 UTC m=+0.165324361 container attach 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., 
io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:45:47 localhost silly_merkle[293212]: 167 167 Feb 23 04:45:47 localhost systemd[1]: libpod-421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73.scope: Deactivated successfully. Feb 23 04:45:47 localhost podman[293194]: 2026-02-23 09:45:47.819033308 +0000 UTC m=+0.175253385 container died 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , release=1770267347, vcs-type=git, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., 
GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2) Feb 23 04:45:47 localhost podman[293228]: 2026-02-23 09:45:47.857663263 +0000 UTC m=+0.065728526 container died 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1770267347, CEPH_POINT_RELEASE=, RELEASE=main, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 23 04:45:47 localhost systemd[1]: var-lib-containers-storage-overlay-e5d279caf730b4f32498bcb8a653e6f9b0d17888f70fd8780e2174e019b37e6c-merged.mount: Deactivated successfully. 
Feb 23 04:45:47 localhost podman[293228]: 2026-02-23 09:45:47.894227494 +0000 UTC m=+0.102292717 container remove 081a8332e685fb2a9081f96d40bdac777e22e1b2c9276d5513069feb8fb9f301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, version=7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:45:47 localhost systemd[1]: tmp-crun.b2stwl.mount: Deactivated successfully. Feb 23 04:45:48 localhost systemd[1]: var-lib-containers-storage-overlay-540c513eff830757a4769e809eb48cf9cd28d0b44842a9010f5253fd2d0aa49d-merged.mount: Deactivated successfully. 
Feb 23 04:45:48 localhost podman[293241]: 2026-02-23 09:45:48.035125496 +0000 UTC m=+0.206536255 container remove 421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_merkle, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public) Feb 23 04:45:48 localhost systemd[1]: libpod-conmon-421054a75b61ec6e795f4ae3451283bbbee699cc8ccba12613986303cdcc9d73.scope: Deactivated successfully. 
Feb 23 04:45:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:45:48.547 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:45:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:45:48.549 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:45:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:45:48.550 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:45:48 localhost systemd[1]: ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46@mon.np0005626463.service: Deactivated successfully. Feb 23 04:45:48 localhost systemd[1]: Stopped Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46. Feb 23 04:45:48 localhost systemd[1]: ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46@mon.np0005626463.service: Consumed 3.836s CPU time. Feb 23 04:45:48 localhost systemd[1]: Reloading. Feb 23 04:45:48 localhost systemd-sysv-generator[293410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:45:48 localhost systemd-rc-local-generator[293406]: /etc/rc.d/rc.local is not marked executable, skipping. 
Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:45:49 localhost nova_compute[282206]: 2026-02-23 09:45:49.304 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:49 localhost nova_compute[282206]: 2026-02-23 09:45:49.307 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:49 localhost nova_compute[282206]: 2026-02-23 09:45:49.307 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:49 
localhost nova_compute[282206]: 2026-02-23 09:45:49.307 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:49 localhost nova_compute[282206]: 2026-02-23 09:45:49.337 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:49 localhost nova_compute[282206]: 2026-02-23 09:45:49.337 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:45:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:45:49 localhost podman[293416]: 2026-02-23 09:45:49.92521666 +0000 UTC m=+0.095718626 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:45:49 localhost podman[293416]: 2026-02-23 09:45:49.966257769 +0000 UTC m=+0.136759745 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:45:49 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:45:50 localhost podman[293417]: 2026-02-23 09:45:50.015356014 +0000 UTC m=+0.185160539 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:45:50 localhost podman[293417]: 2026-02-23 09:45:50.04878704 +0000 UTC m=+0.218591525 container exec_died 
bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:45:50 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:45:52 localhost ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors Feb 23 04:45:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:45:53 localhost systemd[1]: tmp-crun.Iy5Xn6.mount: Deactivated successfully. 
Feb 23 04:45:53 localhost podman[293465]: 2026-02-23 09:45:53.921652333 +0000 UTC m=+0.093926922 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team) Feb 23 04:45:53 localhost podman[293465]: 2026-02-23 09:45:53.969383006 +0000 UTC m=+0.141657605 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:45:53 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:45:54 localhost nova_compute[282206]: 2026-02-23 09:45:54.338 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:54 localhost nova_compute[282206]: 2026-02-23 09:45:54.340 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:54 localhost nova_compute[282206]: 2026-02-23 09:45:54.341 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:54 localhost nova_compute[282206]: 2026-02-23 09:45:54.341 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:54 localhost nova_compute[282206]: 2026-02-23 09:45:54.374 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:54 localhost nova_compute[282206]: 2026-02-23 09:45:54.374 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:56 localhost ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors Feb 23 04:45:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:45:57 localhost podman[293545]: Feb 23 04:45:57 localhost podman[293545]: 2026-02-23 09:45:57.269083841 +0000 UTC m=+0.062606792 container create f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, GIT_CLEAN=True, release=1770267347, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z) Feb 23 04:45:57 localhost podman[293537]: 2026-02-23 09:45:57.277940052 +0000 UTC m=+0.086029369 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:45:57 localhost systemd[1]: Started libpod-conmon-f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0.scope. 
Feb 23 04:45:57 localhost podman[293537]: 2026-02-23 09:45:57.308236511 +0000 UTC m=+0.116325848 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0) Feb 23 04:45:57 localhost systemd[1]: Started libcrun container. 
Feb 23 04:45:57 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:45:57 localhost podman[293545]: 2026-02-23 09:45:57.23940161 +0000 UTC m=+0.032924561 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:45:57 localhost podman[293545]: 2026-02-23 09:45:57.349891399 +0000 UTC m=+0.143414350 container init f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.42.2, version=7, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:45:57 localhost podman[293545]: 2026-02-23 09:45:57.362507165 +0000 UTC m=+0.156030126 container start f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, ceph=True, vcs-type=git, distribution-scope=public, 
io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, release=1770267347, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, architecture=x86_64) Feb 23 04:45:57 localhost podman[293545]: 2026-02-23 09:45:57.362910018 +0000 UTC m=+0.156432979 container attach f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., release=1770267347, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, RELEASE=main, version=7, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux ) Feb 23 04:45:57 localhost eloquent_poincare[293571]: 167 167 Feb 23 04:45:57 localhost systemd[1]: libpod-f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0.scope: Deactivated successfully. Feb 23 04:45:57 localhost podman[293545]: 2026-02-23 09:45:57.367723806 +0000 UTC m=+0.161246837 container died f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, release=1770267347, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.42.2, 
build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=) Feb 23 04:45:57 localhost podman[293577]: 2026-02-23 09:45:57.45949785 +0000 UTC m=+0.082564353 container remove f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_poincare, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, RELEASE=main, name=rhceph, io.buildah.version=1.42.2, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:45:57 localhost systemd[1]: libpod-conmon-f8164d5b521c5ea5308103c738d39a2e3da0e7c5bf4b6f6d4fb016315b7318a0.scope: Deactivated successfully. Feb 23 04:45:58 localhost systemd[1]: var-lib-containers-storage-overlay-bd2efb1e912333087bd01200a4454987b2f49d206f6677cded7970d3308eca30-merged.mount: Deactivated successfully. 
Feb 23 04:45:59 localhost nova_compute[282206]: 2026-02-23 09:45:59.375 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:59 localhost nova_compute[282206]: 2026-02-23 09:45:59.377 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:45:59 localhost nova_compute[282206]: 2026-02-23 09:45:59.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:45:59 localhost nova_compute[282206]: 2026-02-23 09:45:59.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:45:59 localhost nova_compute[282206]: 2026-02-23 09:45:59.416 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:45:59 localhost nova_compute[282206]: 2026-02-23 09:45:59.417 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:00 localhost sshd[293596]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:46:04 localhost nova_compute[282206]: 2026-02-23 09:46:04.418 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:04 localhost nova_compute[282206]: 2026-02-23 09:46:04.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:04 localhost nova_compute[282206]: 2026-02-23 09:46:04.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending 
inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:04 localhost nova_compute[282206]: 2026-02-23 09:46:04.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:04 localhost nova_compute[282206]: 2026-02-23 09:46:04.457 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:04 localhost nova_compute[282206]: 2026-02-23 09:46:04.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:46:08 localhost systemd[1]: tmp-crun.7bXG0j.mount: Deactivated successfully. Feb 23 04:46:08 localhost podman[293616]: 2026-02-23 09:46:08.497992088 +0000 UTC m=+0.111028456 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:46:08 localhost podman[293616]: 2026-02-23 09:46:08.511327177 +0000 UTC m=+0.124363595 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:46:08 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:46:08 localhost podman[293734]: Feb 23 04:46:09 localhost podman[293734]: 2026-02-23 09:46:09.004056218 +0000 UTC m=+0.062547350 container create 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2026-02-09T10:25:24Z, ceph=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:46:09 localhost systemd[1]: Started libpod-conmon-75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3.scope. Feb 23 04:46:09 localhost systemd[1]: Started libcrun container. 
Feb 23 04:46:09 localhost podman[293734]: 2026-02-23 09:46:09.072739744 +0000 UTC m=+0.131230906 container init 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, release=1770267347, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, RELEASE=main) Feb 23 04:46:09 localhost podman[293734]: 2026-02-23 09:46:08.975590026 +0000 UTC m=+0.034081178 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:09 localhost podman[293734]: 2026-02-23 09:46:09.083503774 +0000 UTC m=+0.141994936 container start 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git) Feb 23 04:46:09 localhost podman[293734]: 2026-02-23 09:46:09.083790083 +0000 UTC m=+0.142281275 container attach 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat 
Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=) Feb 23 04:46:09 localhost eager_torvalds[293749]: 167 167 Feb 23 04:46:09 localhost systemd[1]: libpod-75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3.scope: Deactivated successfully. Feb 23 04:46:09 localhost podman[293734]: 2026-02-23 09:46:09.087699074 +0000 UTC m=+0.146190306 container died 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, ceph=True, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:46:09 localhost 
podman[293756]: 2026-02-23 09:46:09.170454821 +0000 UTC m=+0.073843015 container remove 75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_torvalds, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2026-02-09T10:25:24Z, release=1770267347, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main) Feb 23 04:46:09 localhost systemd[1]: libpod-conmon-75829b065fc98c2860e1874c9fb5c5d58e2ff0983a73fb8bc44ded5a8ddcd2e3.scope: Deactivated successfully. 
Feb 23 04:46:09 localhost podman[293772]: Feb 23 04:46:09 localhost podman[293772]: 2026-02-23 09:46:09.284023154 +0000 UTC m=+0.073508896 container create cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2) Feb 23 04:46:09 localhost systemd[1]: Started libpod-conmon-cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f.scope. Feb 23 04:46:09 localhost systemd[1]: Started libcrun container. 
Feb 23 04:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:09 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:09 localhost podman[293772]: 2026-02-23 09:46:09.329064296 +0000 UTC m=+0.118550028 container init cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, 
vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1770267347) Feb 23 04:46:09 localhost podman[293772]: 2026-02-23 09:46:09.337142404 +0000 UTC m=+0.126628136 container start cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, distribution-scope=public, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux , name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:46:09 localhost podman[293772]: 2026-02-23 09:46:09.337386381 +0000 UTC m=+0.126872123 container attach cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=loving_lamport, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.42.2, GIT_CLEAN=True) Feb 23 04:46:09 localhost podman[293772]: 2026-02-23 09:46:09.253539889 +0000 UTC m=+0.043025621 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:09 localhost podman[242954]: time="2026-02-23T09:46:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:46:09 localhost podman[242954]: @ - - [23/Feb/2026:09:46:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155222 "" "Go-http-client/1.1" Feb 23 04:46:09 localhost systemd[1]: libpod-cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f.scope: Deactivated successfully. 
Feb 23 04:46:09 localhost nova_compute[282206]: 2026-02-23 09:46:09.459 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:09 localhost nova_compute[282206]: 2026-02-23 09:46:09.462 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:09 localhost nova_compute[282206]: 2026-02-23 09:46:09.462 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:09 localhost nova_compute[282206]: 2026-02-23 09:46:09.463 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:09 localhost systemd[1]: var-lib-containers-storage-overlay-8c3ef492b3e983313cf8b1437f5b28f51c387bcf24144573a1df52680e91fef4-merged.mount: Deactivated successfully. 
Feb 23 04:46:09 localhost nova_compute[282206]: 2026-02-23 09:46:09.515 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:09 localhost nova_compute[282206]: 2026-02-23 09:46:09.516 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:09 localhost podman[293772]: 2026-02-23 09:46:09.563512265 +0000 UTC m=+0.352997997 container died cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, build-date=2026-02-09T10:25:24Z, vcs-type=git, version=7, RELEASE=main, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=) Feb 23 04:46:09 localhost podman[242954]: @ - - [23/Feb/2026:09:46:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 
18085 "" "Go-http-client/1.1" Feb 23 04:46:09 localhost systemd[1]: var-lib-containers-storage-overlay-1fa2e38c3824c45d6dbbfdbfe881799379db94eb4530d05812357da5a79bbb7d-merged.mount: Deactivated successfully. Feb 23 04:46:09 localhost podman[293836]: 2026-02-23 09:46:09.702158708 +0000 UTC m=+0.272773317 container remove cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=loving_lamport, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1770267347, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True) Feb 23 04:46:09 localhost systemd[1]: libpod-conmon-cffe5b2c38b80ff4aaf970f7d508b10080b63fec6f6dc8d0e0fb30efc6a0345f.scope: Deactivated successfully. Feb 23 04:46:09 localhost systemd[1]: Reloading. Feb 23 04:46:09 localhost systemd-sysv-generator[293892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Feb 23 04:46:09 localhost systemd-rc-local-generator[293888]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: Reloading. Feb 23 04:46:10 localhost systemd-rc-local-generator[293957]: /etc/rc.d/rc.local is not marked executable, skipping. Feb 23 04:46:10 localhost systemd-sysv-generator[293964]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Feb 23 04:46:10 localhost podman[293968]: 2026-02-23 09:46:10.353352178 +0000 UTC m=+0.082442249 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:46:10 localhost podman[293968]: 2026-02-23 09:46:10.458680208 +0000 UTC m=+0.187770299 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux , version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, 
architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph) Feb 23 04:46:10 localhost systemd[1]: Starting Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46... Feb 23 04:46:10 localhost podman[294100]: Feb 23 04:46:10 localhost podman[294100]: 2026-02-23 09:46:10.928706843 +0000 UTC m=+0.207946768 container create a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, version=7, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:46:10 localhost podman[294100]: 2026-02-23 09:46:10.849150943 +0000 UTC m=+0.128390858 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:10 localhost kernel: 
xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d196356613b7f1b76af777d48c9c7870d1fb59f6e36b3b4e9bad5f0ed385b1e/merged/var/lib/ceph/mon/ceph-np0005626463 supports timestamps until 2038 (0x7fffffff) Feb 23 04:46:10 localhost podman[294100]: 2026-02-23 09:46:10.98796053 +0000 UTC m=+0.267200445 container init a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=) Feb 23 04:46:10 localhost podman[294100]: 2026-02-23 09:46:10.996653607 +0000 UTC m=+0.275893522 container start a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mon-np0005626463, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, release=1770267347, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 23 04:46:10 localhost bash[294100]: a517a74ed21c459483d3bfd4abd622efb0a723f6a4c7629a8c105935a56ca753 Feb 23 04:46:11 localhost systemd[1]: 
Started Ceph mon.np0005626463 for f1fea371-cb69-578d-a3d0-b5c472a84b46. Feb 23 04:46:11 localhost ceph-mon[294160]: set uid:gid to 167:167 (ceph:ceph) Feb 23 04:46:11 localhost ceph-mon[294160]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2 Feb 23 04:46:11 localhost ceph-mon[294160]: pidfile_write: ignore empty --pid-file Feb 23 04:46:11 localhost ceph-mon[294160]: load: jerasure load: lrc Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: RocksDB version: 7.9.2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Git sha 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Compile date 2026-02-06 00:00:00 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: DB SUMMARY Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: DB Session ID: 66DAQ76CBLV8DSGL8JC7 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: CURRENT file: CURRENT Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: IDENTITY file: IDENTITY Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005626463/store.db dir, Total Num: 0, files: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005626463/store.db: 000004.log size: 886 ; Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.error_if_exists: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.create_if_missing: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.paranoid_checks: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.flush_verify_memtable_count: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.env: 0x5609fa701a20 Feb 23 04:46:11 
localhost ceph-mon[294160]: rocksdb: Options.fs: PosixFileSystem Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.info_log: 0x5609fbabcd20 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_file_opening_threads: 16 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.statistics: (nil) Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.use_fsync: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_log_file_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_manifest_file_size: 1073741824 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.log_file_time_to_roll: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.keep_log_file_num: 1000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.recycle_log_file_num: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.allow_fallocate: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.allow_mmap_reads: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.allow_mmap_writes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.use_direct_reads: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.create_missing_column_families: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.db_log_dir: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.wal_dir: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.table_cache_numshardbits: 6 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.WAL_ttl_seconds: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.WAL_size_limit_MB: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.manifest_preallocation_size: 4194304 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: 
Options.is_fd_close_on_exec: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.advise_random_on_open: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.db_write_buffer_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.write_buffer_manager: 0x5609fbacd540 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.access_hint_on_compaction_start: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.random_access_max_buffer_size: 1048576 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.use_adaptive_mutex: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.rate_limiter: (nil) Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.wal_recovery_mode: 2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.enable_thread_tracking: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.enable_pipelined_write: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.unordered_write: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.allow_concurrent_memtable_write: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.write_thread_max_yield_usec: 100 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.write_thread_slow_yield_usec: 3 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.row_cache: None Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.wal_filter: None Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.avoid_flush_during_recovery: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.allow_ingest_behind: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.two_write_queues: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.manual_wal_flush: 0 Feb 23 04:46:11 localhost 
ceph-mon[294160]: rocksdb: Options.wal_compression: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.atomic_flush: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.persist_stats_to_disk: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.write_dbid_to_manifest: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.log_readahead_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.file_checksum_gen_factory: Unknown Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.best_efforts_recovery: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.allow_data_in_errors: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.db_host_id: __hostname__ Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.enforce_single_del_contracts: true Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_background_jobs: 2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_background_compactions: -1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_subcompactions: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.avoid_flush_during_shutdown: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.delayed_write_rate : 16777216 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_total_wal_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.stats_dump_period_sec: 600 Feb 23 04:46:11 localhost ceph-mon[294160]: 
rocksdb: Options.stats_persist_period_sec: 600 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.stats_history_buffer_size: 1048576 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_open_files: -1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bytes_per_sync: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.wal_bytes_per_sync: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.strict_bytes_per_sync: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_readahead_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_background_flushes: -1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Compression algorithms supported: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kZSTD supported: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kXpressCompression supported: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kBZip2Compression supported: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kLZ4Compression supported: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kZlibCompression supported: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kLZ4HCCompression supported: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: #011kSnappyCompression supported: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Fast CRC32 supported: Supported on x86 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: DMutex implementation: pthread_mutex_t Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.comparator: 
leveldb.BytewiseComparator Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.merge_operator: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_filter: None Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_filter_factory: None Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.sst_partitioner_factory: None Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.memtable_factory: SkipListFactory Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.table_factory: BlockBasedTable Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5609fbabc980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5609fbab9350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.write_buffer_size: 33554432 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: 
Options.max_write_buffer_number: 2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression: NoCompression Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression: Disabled Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.prefix_extractor: nullptr Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.num_levels: 7 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.level: 32767 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.enabled: false Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.window_bits: -14 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.level: 32767 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: 
Options.compression_opts.strategy: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.parallel_threads: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.enabled: false Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.level0_stop_writes_trigger: 36 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.target_file_size_base: 67108864 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.target_file_size_multiplier: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_base: 268435456 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[5]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_compaction_bytes: 1677721600 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.arena_block_size: 1048576 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.disable_auto_compactions: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_style: kCompactionStyleLevel Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Feb 23 04:46:11 
localhost ceph-mon[294160]: rocksdb: Options.table_properties_collectors: Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.inplace_update_support: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.inplace_update_num_locks: 10000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.memtable_whole_key_filtering: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.memtable_huge_page_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.bloom_locality: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.max_successive_merges: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.optimize_filters_for_hits: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.paranoid_file_checks: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.force_consistency_checks: 1 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.report_bg_io_stats: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.ttl: 2592000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.periodic_compaction_seconds: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.preclude_last_level_data_seconds: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.preserve_internal_time_seconds: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.enable_blob_files: false Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.min_blob_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.blob_file_size: 268435456 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.blob_compression_type: NoCompression Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.enable_blob_garbage_collection: false Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Feb 23 04:46:11 localhost 
ceph-mon[294160]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.blob_compaction_readahead_size: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.blob_file_starting_level: 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005626463/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 4cfd6c8f-aafa-4003-b2f6-d22c49635dd4 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839971046733, "job": 1, "event": "recovery_started", "wal_files": [4]} Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839971050071, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 
0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839971050180, "job": 1, "event": "recovery_finished"} Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x5609fbae0e00 Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: DB pointer 0x5609fbbd6000 Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463 does not exist in monmap, will attempt to join an existing cluster Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:46:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 
MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 
0.00 GB write, 0.14 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5609fbab9350#2 capacity: 512.00 MB usage: 1.30 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1,1.08 KB,0.000205636%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 23 04:46:11 localhost ceph-mon[294160]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] Feb 23 04:46:11 localhost ceph-mon[294160]: starting mon.np0005626463 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005626463 fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(???) 
e0 preinit fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing) e8 sync_obtain_latest_monmap Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8 Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).mds e17 new map Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).mds e17 print_map
e17
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	16
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-02-23T07:57:46.097663+0000
modified	2026-02-23T09:43:29.529267+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
required_client_features	{}
last_failure	0
last_failure_osd_epoch	79
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
max_mds	1
in	0
up	{0=26518}
failed	
damaged	
stopped	
data_pools	[6]
metadata_pool	7
inline_data	disabled
balancer	
bal_rank_mask	-1
standby_count_wanted	1
qdb_cluster	leader: 26518 members: 26518
[mds.mds.np0005626463.qcthuc{0:26518} state up:active seq 13 addr [v2:172.18.0.106:6808/2515508693,v1:172.18.0.106:6809/2515508693] compat 
{c=[1],r=[1],i=[17ff]}]

Standby daemons:

[mds.mds.np0005626465.drvnoy{-1:26498} state up:standby seq 1 addr [v2:172.18.0.107:6808/2939113664,v1:172.18.0.107:6809/2939113664] compat {c=[1],r=[1],i=[17ff]}]
[mds.mds.np0005626466.vaywlp{-1:26506} state up:standby seq 1 addr [v2:172.18.0.108:6808/2035422599,v1:172.18.0.108:6809/2035422599] compat {c=[1],r=[1],i=[17ff]}] Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 3314933000854323200, adjusting msgr requires Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).osd e81 crush map has features 432629239337189376, adjusting msgr requires Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626460 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626463 in quorum (ranks 0,1,2,4) Feb 23 04:46:11 localhost ceph-mon[294160]: Health check failed: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 (MON_DOWN) Feb 23 04:46:11 localhost ceph-mon[294160]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 Feb 23 04:46:11 localhost ceph-mon[294160]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463 Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626465 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: Remove daemons mon.np0005626459 Feb 23 04:46:11 localhost ceph-mon[294160]: Safe to remove mon.np0005626459: new quorum should be ['np0005626461', 'np0005626460', 'np0005626466', 
'np0005626465', 'np0005626463'] (from ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465', 'np0005626463']) Feb 23 04:46:11 localhost ceph-mon[294160]: Removing monitor np0005626459 from monmap... Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "mon rm", "name": "np0005626459"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Removing daemon mon.np0005626459 from np0005626459.localdomain -- ports [] Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626460 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4) Feb 23 04:46:11 localhost ceph-mon[294160]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005626461,np0005626460,np0005626466,np0005626463) Feb 23 04:46:11 localhost ceph-mon[294160]: Cluster is now healthy Feb 23 04:46:11 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", 
"allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Removed label mon from host np0005626459.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Removed label mgr from host np0005626459.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Removed label _admin from host np0005626459.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating 
np0005626460.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Removing np0005626459.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:46:11 localhost ceph-mon[294160]: Removing np0005626459.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Added label _no_schedule to host 
np0005626459.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626459.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Removing daemon crash.np0005626459 from np0005626459.localdomain -- ports [] Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "client.crash.np0005626459"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005626459"}]': finished Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Removing key for client.crash.np0005626459 Feb 23 04:46:11 localhost ceph-mon[294160]: 
from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Removing daemon mgr.np0005626459.pmtxxl from np0005626459.localdomain -- ports [9283, 8765] Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain"}]': finished Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005626459.pmtxxl"}]': finished Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Removed host np0005626459.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: Removing key for mgr.np0005626459.pmtxxl Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": 
"auth get", "entity": "client.admin"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626460 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mon.np0005626460 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: Saving service mon spec with placement label:mon Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... 
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: Remove daemons mon.np0005626463
Feb 23 04:46:11 localhost ceph-mon[294160]: Safe to remove mon.np0005626463: new quorum should be ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465'] (from ['np0005626461', 'np0005626460', 'np0005626466', 'np0005626465'])
Feb 23 04:46:11 localhost ceph-mon[294160]: Removing monitor np0005626463 from monmap...
Feb 23 04:46:11 localhost ceph-mon[294160]: Removing daemon mon.np0005626463 from np0005626463.localdomain -- ports []
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626460 calling monitor election
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466,np0005626465 in quorum (ranks 0,2,3)
Feb 23 04:46:11 localhost ceph-mon[294160]: overall HEALTH_OK
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465 in quorum (ranks 0,1,2,3)
Feb 23 04:46:11 localhost ceph-mon[294160]: overall HEALTH_OK
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 04:46:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:46:11 localhost ceph-mon[294160]: Deploying daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh'
Feb 23 04:46:11 localhost ceph-mon[294160]: mon.np0005626463@-1(synchronizing).paxosservice(auth 1..37) refresh upgraded, format 0 -> 3
Feb 23 04:46:11 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Feb 23 04:46:12 localhost ceph-mon[294160]: mon.np0005626463@-1(probing) e8 handle_auth_request failed to assign global_id
Feb 23 04:46:12 localhost ceph-mon[294160]: mon.np0005626463@-1(probing) e8 handle_auth_request failed to assign global_id
Feb 23 04:46:12 localhost ceph-mon[294160]: mon.np0005626463@-1(probing) e8 handle_auth_request failed to assign global_id
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.743 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.743 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.744 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 23 04:46:12 localhost nova_compute[282206]: 2026-02-23 09:46:12.744 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 23 04:46:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:46:12 localhost podman[294268]: 2026-02-23 09:46:12.906280132 +0000 UTC m=+0.074515397 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 23 04:46:12 localhost podman[294268]: 2026-02-23 09:46:12.920827128 +0000 UTC m=+0.089062453 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 23 04:46:12 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:46:13 localhost openstack_network_exporter[245358]: ERROR 09:46:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 04:46:13 localhost openstack_network_exporter[245358]:
Feb 23 04:46:13 localhost openstack_network_exporter[245358]: ERROR 09:46:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 04:46:13 localhost openstack_network_exporter[245358]:
Feb 23 04:46:13 localhost ceph-mon[294160]: mon.np0005626463@-1(probing) e9 my rank is now 4 (was -1)
Feb 23 04:46:13 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 04:46:13 localhost ceph-mon[294160]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1
Feb 23 04:46:13 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:46:13 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:13 localhost nova_compute[282206]: 2026-02-23 09:46:13.766 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 23 04:46:13 localhost nova_compute[282206]: 2026-02-23 09:46:13.791 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 23 04:46:13 localhost nova_compute[282206]: 2026-02-23 09:46:13.792 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.545 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.545 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.545 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:46:14 localhost nova_compute[282206]: 2026-02-23 09:46:14.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:46:15 localhost nova_compute[282206]: 2026-02-23 09:46:15.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:15 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:15 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:15 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.053 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.073 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.075 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 23 04:46:16 localhost nova_compute[282206]: 2026-02-23 09:46:16.075 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:46:16 localhost ceph-mon[294160]: mgrc update_daemon_metadata mon.np0005626463 metadata {addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005626463.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005626463.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626460 calling monitor election
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626460,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3,4)
Feb 23 04:46:16 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:46:16 localhost ceph-mon[294160]: overall HEALTH_OK
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:16 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.180 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.243 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.244 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.479 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.481 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11836MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.481 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.592 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.593 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.593 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:46:17 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:17 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:17 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:17 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:17 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:17 localhost nova_compute[282206]: 2026-02-23 09:46:17.677 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd 
(subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:46:17 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id Feb 23 04:46:17 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_auth_request failed to assign global_id Feb 23 04:46:18 localhost nova_compute[282206]: 2026-02-23 09:46:18.111 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:46:18 localhost nova_compute[282206]: 2026-02-23 09:46:18.117 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:46:18 localhost nova_compute[282206]: 2026-02-23 09:46:18.138 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:46:18 localhost nova_compute[282206]: 2026-02-23 09:46:18.140 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 
- - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:46:18 localhost nova_compute[282206]: 2026-02-23 09:46:18.141 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:46:18 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:18 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:18 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:18 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:18 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 
localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:18 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.142 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.548 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.585 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.585 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.586 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.586 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:19 localhost nova_compute[282206]: 2026-02-23 09:46:19.587 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:19 localhost ceph-mon[294160]: Reconfiguring crash.np0005626460 (monmap changed)... Feb 23 04:46:19 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain Feb 23 04:46:19 localhost ceph-mon[294160]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON) Feb 23 04:46:19 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:19 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:19 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:19 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:46:20 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)... 
Feb 23 04:46:20 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain Feb 23 04:46:20 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:20 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:20 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... Feb 23 04:46:20 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:20 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:20 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:46:20 localhost podman[294670]: 2026-02-23 09:46:20.955709059 +0000 UTC m=+0.089379022 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:46:20 localhost podman[294670]: 2026-02-23 09:46:20.964252631 +0000 UTC m=+0.097922634 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:46:20 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:46:21 localhost podman[294669]: 2026-02-23 09:46:21.050799216 +0000 UTC m=+0.184703916 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 23 04:46:21 localhost podman[294669]: 2026-02-23 09:46:21.092261438 +0000 UTC m=+0.226166118 
container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true) Feb 23 04:46:21 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:46:22 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:22 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:22 localhost ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)... 
Feb 23 04:46:22 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:22 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:22 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:46:22 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:22 localhost podman[294769]: Feb 23 04:46:22 localhost podman[294769]: 2026-02-23 09:46:22.910648593 +0000 UTC m=+0.077072475 container create 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, ceph=True, io.buildah.version=1.42.2, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, 
build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7) Feb 23 04:46:22 localhost systemd[1]: Started libpod-conmon-2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5.scope. Feb 23 04:46:22 localhost podman[294769]: 2026-02-23 09:46:22.879405605 +0000 UTC m=+0.045829517 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:22 localhost systemd[1]: Started libcrun container. Feb 23 04:46:22 localhost podman[294769]: 2026-02-23 09:46:22.995219286 +0000 UTC m=+0.161643168 container init 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , name=rhceph, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:46:23 localhost podman[294769]: 2026-02-23 09:46:23.00543047 +0000 UTC 
m=+0.171854322 container start 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, GIT_CLEAN=True, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347) Feb 23 04:46:23 localhost podman[294769]: 2026-02-23 09:46:23.005592825 +0000 UTC m=+0.172016747 container attach 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux 
, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 23 04:46:23 localhost competent_mclaren[294784]: 167 167 Feb 23 04:46:23 localhost systemd[1]: libpod-2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5.scope: Deactivated successfully. Feb 23 04:46:23 localhost podman[294769]: 2026-02-23 09:46:23.010193525 +0000 UTC m=+0.176617417 container died 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, build-date=2026-02-09T10:25:24Z, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:46:23 localhost podman[294789]: 2026-02-23 09:46:23.117385143 +0000 UTC m=+0.097024057 container remove 2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mclaren, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_CLEAN=True, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:46:23 localhost systemd[1]: libpod-conmon-2c587f2c7bca22e7b3c4c16a4fc85f338a97d4abd2a83060c49f6749bf53a8a5.scope: Deactivated successfully. 
Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:23 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:23 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:23 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:46:23 localhost podman[294858]: Feb 23 04:46:23 localhost podman[294858]: 2026-02-23 09:46:23.718604061 +0000 UTC m=+0.065750407 container create 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1770267347, RELEASE=main, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , name=rhceph, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z) Feb 23 04:46:23 localhost systemd[1]: Started libpod-conmon-925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36.scope. Feb 23 04:46:23 localhost systemd[1]: Started libcrun container. Feb 23 04:46:23 localhost podman[294858]: 2026-02-23 09:46:23.783700868 +0000 UTC m=+0.130847214 container init 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, ceph=True, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, 
GIT_CLEAN=True, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.42.2) Feb 23 04:46:23 localhost podman[294858]: 2026-02-23 09:46:23.688422566 +0000 UTC m=+0.035568952 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:23 localhost podman[294858]: 2026-02-23 09:46:23.793297372 +0000 UTC m=+0.140443718 container start 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.42.2, RELEASE=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:46:23 localhost podman[294858]: 2026-02-23 09:46:23.7935646 +0000 UTC 
m=+0.140710986 container attach 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:46:23 localhost nifty_cartwright[294873]: 167 167 Feb 23 04:46:23 localhost systemd[1]: libpod-925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36.scope: Deactivated successfully. 
Feb 23 04:46:23 localhost podman[294858]: 2026-02-23 09:46:23.796247432 +0000 UTC m=+0.143393818 container died 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main) Feb 23 04:46:23 localhost podman[294878]: 2026-02-23 09:46:23.901872222 +0000 UTC m=+0.096819541 container remove 925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_cartwright, GIT_BRANCH=main, vendor=Red Hat, Inc., release=1770267347, build-date=2026-02-09T10:25:24Z, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, 
url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:46:23 localhost systemd[1]: libpod-conmon-925bbe810d926574da5e77207941d6f0ec02fde21bab3fda09f87b75b2e45e36.scope: Deactivated successfully. Feb 23 04:46:23 localhost systemd[1]: var-lib-containers-storage-overlay-26e4ca7a38c877c345ec4abd06c52a942e386c17a66ca1c6efc4467254a1c811-merged.mount: Deactivated successfully. Feb 23 04:46:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:46:24 localhost podman[294920]: 2026-02-23 09:46:24.394575602 +0000 UTC m=+0.099371989 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, 
config_id=ceilometer_agent_compute) Feb 23 04:46:24 localhost podman[294920]: 2026-02-23 09:46:24.410199371 +0000 UTC m=+0.114995738 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:46:24 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:46:24 localhost nova_compute[282206]: 2026-02-23 09:46:24.588 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:24 localhost nova_compute[282206]: 2026-02-23 09:46:24.591 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:24 localhost nova_compute[282206]: 2026-02-23 09:46:24.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:24 localhost nova_compute[282206]: 2026-02-23 09:46:24.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:24 localhost nova_compute[282206]: 2026-02-23 09:46:24.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:24 localhost nova_compute[282206]: 2026-02-23 09:46:24.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:24 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... 
Feb 23 04:46:24 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:46:24 localhost ceph-mon[294160]: Reconfig service osd.default_drive_group Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:24 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:46:24 localhost podman[294975]: Feb 23 04:46:24 localhost podman[294975]: 2026-02-23 09:46:24.802474741 
+0000 UTC m=+0.074743223 container create ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, ceph=True, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, release=1770267347) Feb 23 04:46:24 localhost systemd[1]: Started libpod-conmon-ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca.scope. Feb 23 04:46:24 localhost systemd[1]: Started libcrun container. 
Feb 23 04:46:24 localhost podman[294975]: 2026-02-23 09:46:24.772270714 +0000 UTC m=+0.044539236 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:24 localhost podman[294975]: 2026-02-23 09:46:24.871606391 +0000 UTC m=+0.143874873 container init ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True) Feb 23 04:46:24 localhost podman[294975]: 2026-02-23 09:46:24.88199026 +0000 UTC m=+0.154258792 container start ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, RELEASE=main, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red 
Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, ceph=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph) Feb 23 04:46:24 localhost podman[294975]: 2026-02-23 09:46:24.882437214 +0000 UTC m=+0.154705746 container attach ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_BRANCH=main, RELEASE=main, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, architecture=x86_64, release=1770267347) Feb 23 04:46:24 localhost quirky_vaughan[294990]: 167 167 Feb 23 04:46:24 localhost systemd[1]: libpod-ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca.scope: Deactivated successfully. Feb 23 04:46:24 localhost podman[294995]: 2026-02-23 09:46:24.955389501 +0000 UTC m=+0.051667266 container died ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, RELEASE=main, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph) Feb 23 04:46:24 localhost systemd[1]: 
var-lib-containers-storage-overlay-86e1915b5c9b3973cde7e615ce4bf05e45f2467ef3f3bda96317b67dac1cd812-merged.mount: Deactivated successfully. Feb 23 04:46:24 localhost podman[294995]: 2026-02-23 09:46:24.99580942 +0000 UTC m=+0.092087145 container remove ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_vaughan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1770267347, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, build-date=2026-02-09T10:25:24Z, architecture=x86_64, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux ) Feb 23 04:46:25 localhost systemd[1]: libpod-conmon-ae2aad132b40befd8a2b3d9ed6de74e43b1dea9f69206921bd1beb7f9a38afca.scope: Deactivated successfully. Feb 23 04:46:25 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e9 handle_command mon_command({"prefix": "mgr fail"} v 0) Feb 23 04:46:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 
172.18.0.200:0/2634313896' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Feb 23 04:46:25 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Feb 23 04:46:25 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 e82: 6 total, 6 up, 6 in Feb 23 04:46:25 localhost systemd-logind[759]: Session 64 logged out. Waiting for processes to exit. Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.673673) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985673763, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10902, "num_deletes": 256, "total_data_size": 15180226, "memory_usage": 15533664, "flush_reason": "Manual Compaction"} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985729908, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 13027101, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10907, "table_properties": {"data_size": 
12968467, "index_size": 31904, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25669, "raw_key_size": 271477, "raw_average_key_size": 26, "raw_value_size": 12792918, "raw_average_value_size": 1247, "num_data_blocks": 1226, "num_entries": 10255, "num_filter_entries": 10255, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 1771839971, "file_creation_time": 1771839985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 56428 microseconds, and 29493 cpu microseconds. 
Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.730101) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 13027101 bytes OK Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.730157) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.732000) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.732028) EVENT_LOG_v1 {"time_micros": 1771839985732021, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.732119) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 15105492, prev total WAL file size 15106296, number of live WAL files 2. Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.735191) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. 
'7061786F73003130353433' seq:0, type:0; will stop at (end) Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(2012B)] Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985735311, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 13029113, "oldest_snapshot_seqno": -1} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10003 keys, 13023826 bytes, temperature: kUnknown Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985809913, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 13023826, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12965832, "index_size": 31909, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25029, "raw_key_size": 266594, "raw_average_key_size": 26, "raw_value_size": 12793573, "raw_average_value_size": 1278, "num_data_blocks": 1225, "num_entries": 10003, "num_filter_entries": 10003, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771839985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.810292) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 13023826 bytes Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.812188) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.4 rd, 174.3 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.4, 0.0 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10260, records dropped: 257 output_compression: NoCompression Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.812227) EVENT_LOG_v1 {"time_micros": 1771839985812210, "job": 4, "event": "compaction_finished", "compaction_time_micros": 74713, "compaction_time_cpu_micros": 38717, "output_level": 6, "num_output_files": 1, "total_output_size": 13023826, "num_input_records": 10260, "num_output_records": 10003, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000014.sst immediately, 
rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985814939, "job": 4, "event": "table_file_deletion", "file_number": 14} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771839985815033, "job": 4, "event": "table_file_deletion", "file_number": 8} Feb 23 04:46:25 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:25.735078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:25 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)... Feb 23 04:46:25 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.14193 172.18.0.105:0/3093743034' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.14193 ' entity='mgr.np0005626461.lrfquh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='client.? 
172.18.0.200:0/2634313896' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: Activating manager daemon np0005626460.fyrady Feb 23 04:46:25 localhost ceph-mon[294160]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:46:25 localhost ceph-mon[294160]: Manager daemon np0005626460.fyrady is now available Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"}]': finished Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"} : dispatch Feb 23 04:46:25 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626459.localdomain.devices.0"}]': finished Feb 23 04:46:25 localhost podman[295073]: Feb 23 04:46:25 localhost podman[295073]: 2026-02-23 09:46:25.861567301 +0000 UTC m=+0.081036706 container create df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:46:25 localhost systemd[1]: Started libpod-conmon-df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525.scope. Feb 23 04:46:25 localhost podman[295073]: 2026-02-23 09:46:25.829491768 +0000 UTC m=+0.048961213 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:25 localhost systemd[1]: Started libcrun container. 
Feb 23 04:46:25 localhost podman[295073]: 2026-02-23 09:46:25.947469146 +0000 UTC m=+0.166938531 container init df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:46:25 localhost sshd[295091]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:46:25 localhost systemd[1]: libpod-df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525.scope: Deactivated successfully. 
Feb 23 04:46:25 localhost affectionate_boyd[295088]: 167 167 Feb 23 04:46:25 localhost podman[295073]: 2026-02-23 09:46:25.960551517 +0000 UTC m=+0.180020932 container start df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, RELEASE=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=) Feb 23 04:46:25 localhost podman[295073]: 2026-02-23 09:46:25.960846546 +0000 UTC m=+0.180315971 container attach df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True) Feb 23 04:46:25 localhost podman[295073]: 2026-02-23 09:46:25.963515628 +0000 UTC m=+0.182985063 container died df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, 
RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:46:26 localhost systemd-logind[759]: New session 67 of user ceph-admin. Feb 23 04:46:26 localhost podman[295095]: 2026-02-23 09:46:26.05616274 +0000 UTC m=+0.085757342 container remove df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_boyd, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, ceph=True, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, release=1770267347, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:46:26 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1019644620 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:46:26 localhost systemd[1]: Started Session 67 of User ceph-admin. 
Feb 23 04:46:26 localhost systemd[1]: libpod-conmon-df91ff37b17b5e6496cfe283e3fb46bb598f98104f82230101be61a8357f5525.scope: Deactivated successfully. Feb 23 04:46:26 localhost systemd[1]: session-64.scope: Deactivated successfully. Feb 23 04:46:26 localhost systemd[1]: session-64.scope: Consumed 24.553s CPU time. Feb 23 04:46:26 localhost systemd-logind[759]: Removed session 64. Feb 23 04:46:26 localhost ceph-mon[294160]: removing stray HostCache host record np0005626459.localdomain.devices.0 Feb 23 04:46:26 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/mirror_snapshot_schedule"} : dispatch Feb 23 04:46:26 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/mirror_snapshot_schedule"} : dispatch Feb 23 04:46:26 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/trash_purge_schedule"} : dispatch Feb 23 04:46:26 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626460.fyrady/trash_purge_schedule"} : dispatch Feb 23 04:46:26 localhost systemd[1]: var-lib-containers-storage-overlay-a90b42d502dd51f8451e7a369fbadfad49a814ef823e8e68fce767d255c472e3-merged.mount: Deactivated successfully. Feb 23 04:46:27 localhost systemd[1]: tmp-crun.G4IdEQ.mount: Deactivated successfully. 
Feb 23 04:46:27 localhost podman[295220]: 2026-02-23 09:46:27.084478206 +0000 UTC m=+0.093892419 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Feb 23 04:46:27 localhost podman[295220]: 2026-02-23 09:46:27.182715926 +0000 UTC m=+0.192130199 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , release=1770267347, GIT_BRANCH=main, vendor=Red Hat, Inc., io.buildah.version=1.42.2) Feb 23 04:46:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:46:27 localhost podman[295272]: 2026-02-23 09:46:27.504594872 +0000 UTC m=+0.135474994 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:46:27 localhost 
podman[295272]: 2026-02-23 09:46:27.537250536 +0000 UTC m=+0.168130638 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS) Feb 23 04:46:27 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:46:27 localhost ceph-mon[294160]: [23/Feb/2026:09:46:26] ENGINE Bus STARTING Feb 23 04:46:27 localhost ceph-mon[294160]: [23/Feb/2026:09:46:26] ENGINE Serving on http://172.18.0.104:8765 Feb 23 04:46:27 localhost ceph-mon[294160]: [23/Feb/2026:09:46:27] ENGINE Serving on https://172.18.0.104:7150 Feb 23 04:46:27 localhost ceph-mon[294160]: [23/Feb/2026:09:46:27] ENGINE Bus STARTED Feb 23 04:46:27 localhost ceph-mon[294160]: [23/Feb/2026:09:46:27] ENGINE Client ('172.18.0.104', 55702) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:46:27 localhost ceph-mon[294160]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s)) Feb 23 04:46:27 localhost ceph-mon[294160]: Cluster is now healthy Feb 23 04:46:27 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:27 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:27 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:27 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:28 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:28 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:28 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:28 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:28 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:28 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:29 localhost nova_compute[282206]: 2026-02-23 09:46:29.632 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms 
timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:29 localhost nova_compute[282206]: 2026-02-23 09:46:29.635 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:29 localhost nova_compute[282206]: 2026-02-23 09:46:29.636 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:29 localhost nova_compute[282206]: 2026-02-23 09:46:29.636 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:29 localhost nova_compute[282206]: 2026-02-23 09:46:29.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:29 localhost nova_compute[282206]: 2026-02-23 09:46:29.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' 
Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' 
entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:46:30 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:46:30 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:46:30 
localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 23 04:46:30 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:46:30 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf
Feb 23 04:46:30 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 04:46:30 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 04:46:30 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 04:46:30 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 04:46:31 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020046648 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:31 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:46:33 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:46:34 localhost nova_compute[282206]: 2026-02-23 09:46:34.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:46:34 localhost nova_compute[282206]: 2026-02-23 09:46:34.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:46:34 localhost nova_compute[282206]: 2026-02-23 09:46:34.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:46:34 localhost nova_compute[282206]: 2026-02-23 09:46:34.641 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:34 localhost nova_compute[282206]: 2026-02-23 09:46:34.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:46:34 localhost nova_compute[282206]: 2026-02-23 09:46:34.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:46:34 localhost sshd[296139]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:46:35 localhost sshd[296189]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:46:35 localhost podman[296195]:
Feb 23 04:46:35 localhost podman[296195]: 2026-02-23 09:46:35.440989756 +0000 UTC m=+0.085657269 container create 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , release=1770267347)
Feb 23 04:46:35 localhost systemd[1]: Started libpod-conmon-0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3.scope.
Feb 23 04:46:35 localhost systemd[1]: Started libcrun container.
Feb 23 04:46:35 localhost podman[296195]: 2026-02-23 09:46:35.406712653 +0000 UTC m=+0.051380166 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:46:35 localhost podman[296195]: 2026-02-23 09:46:35.512031847 +0000 UTC m=+0.156699350 container init 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container)
Feb 23 04:46:35 localhost podman[296195]: 2026-02-23 09:46:35.523207457 +0000 UTC m=+0.167874960 container start 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 23 04:46:35 localhost podman[296195]: 2026-02-23 09:46:35.523471265 +0000 UTC m=+0.168138818 container attach 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, RELEASE=main, name=rhceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Feb 23 04:46:35 localhost cool_gagarin[296210]: 167 167
Feb 23 04:46:35 localhost systemd[1]: libpod-0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3.scope: Deactivated successfully.
Feb 23 04:46:35 localhost podman[296195]: 2026-02-23 09:46:35.528656163 +0000 UTC m=+0.173323686 container died 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, version=7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7)
Feb 23 04:46:35 localhost podman[296215]: 2026-02-23 09:46:35.617672843 +0000 UTC m=+0.080155921 container remove 0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_gagarin, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Feb 23 04:46:35 localhost systemd[1]: libpod-conmon-0b288183a06c1b9b312330d7dc7cd22e507cfd431ac128cd722b6545a0a925c3.scope: Deactivated successfully.
Feb 23 04:46:35 localhost ceph-mon[294160]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 23 04:46:35 localhost ceph-mon[294160]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 23 04:46:35 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 04:46:35 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:35 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:35 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 04:46:36 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054580 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:46:36 localhost podman[296284]:
Feb 23 04:46:36 localhost podman[296284]: 2026-02-23 09:46:36.331644403 +0000 UTC m=+0.078417698 container create 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, ceph=True, distribution-scope=public, version=7, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 04:46:36 localhost systemd[1]: Started libpod-conmon-57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42.scope.
Feb 23 04:46:36 localhost systemd[1]: Started libcrun container.
Feb 23 04:46:36 localhost podman[296284]: 2026-02-23 09:46:36.393312389 +0000 UTC m=+0.140085724 container init 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 23 04:46:36 localhost podman[296284]: 2026-02-23 09:46:36.297900696 +0000 UTC m=+0.044674011 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:46:36 localhost podman[296284]: 2026-02-23 09:46:36.400499748 +0000 UTC m=+0.147273043 container start 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, release=1770267347)
Feb 23 04:46:36 localhost podman[296284]: 2026-02-23 09:46:36.400939971 +0000 UTC m=+0.147713266 container attach 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux )
Feb 23 04:46:36 localhost exciting_swanson[296300]: 167 167
Feb 23 04:46:36 localhost systemd[1]: libpod-57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42.scope: Deactivated successfully.
Feb 23 04:46:36 localhost podman[296284]: 2026-02-23 09:46:36.404404137 +0000 UTC m=+0.151177452 container died 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Feb 23 04:46:36 localhost systemd[1]: var-lib-containers-storage-overlay-c1e744a8ad4bd2d9f8790c7b7d1756aecd0c747ece9a113d10c94dcad058ee51-merged.mount: Deactivated successfully.
Feb 23 04:46:36 localhost systemd[1]: var-lib-containers-storage-overlay-b93adf770a490ffdcc678b074d950fbad1c73cb20c852b53bdf2231751fdb3be-merged.mount: Deactivated successfully.
Feb 23 04:46:36 localhost podman[296305]: 2026-02-23 09:46:36.511687112 +0000 UTC m=+0.090986690 container remove 57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_swanson, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1770267347, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Feb 23 04:46:36 localhost systemd[1]: libpod-conmon-57e941f9c9b822e0a1f95b7369799a6cc5b05c9521cb2924197e9e94ecb66c42.scope: Deactivated successfully.
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:36 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:46:36 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:46:36 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:46:37 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:46:37 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:46:37 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:37 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:37 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 04:46:38 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 04:46:38 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:46:38 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:38 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:38 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:38 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:38 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 04:46:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:46:38 localhost podman[296322]: 2026-02-23 09:46:38.92434721 +0000 UTC m=+0.090351570 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:46:38 localhost podman[296322]: 2026-02-23 09:46:38.935666975 +0000 UTC m=+0.101671365 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 04:46:38 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 04:46:39 localhost podman[242954]: time="2026-02-23T09:46:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 04:46:39 localhost podman[242954]: @ - - [23/Feb/2026:09:46:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 04:46:39 localhost podman[242954]: @ - - [23/Feb/2026:09:46:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18260 "" "Go-http-client/1.1"
Feb 23 04:46:39 localhost nova_compute[282206]: 2026-02-23 09:46:39.700 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:46:39 localhost nova_compute[282206]: 2026-02-23 09:46:39.702 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:46:39 localhost nova_compute[282206]: 2026-02-23 09:46:39.702 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:46:39 localhost nova_compute[282206]: 2026-02-23 09:46:39.702 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:46:39 localhost nova_compute[282206]: 2026-02-23 09:46:39.746 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:46:39 localhost nova_compute[282206]: 2026-02-23 09:46:39.747 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:46:39 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 04:46:39 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady'
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:39 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:46:40 localhost ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 04:46:40 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:46:40 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain Feb 23 04:46:40 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:40 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:40 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:40 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:41 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:46:41 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)... Feb 23 04:46:41 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:46:41 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:41 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:41 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:42 localhost ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)... 
Feb 23 04:46:42 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:46:42 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:42 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:42 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)... Feb 23 04:46:42 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:42 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:46:42 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:46:43 localhost openstack_network_exporter[245358]: ERROR 09:46:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:46:43 localhost openstack_network_exporter[245358]: Feb 23 04:46:43 localhost openstack_network_exporter[245358]: ERROR 09:46:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:46:43 localhost openstack_network_exporter[245358]: Feb 23 04:46:43 localhost systemd[1]: session-65.scope: Deactivated successfully. Feb 23 04:46:43 localhost systemd[1]: session-65.scope: Consumed 1.746s CPU time. Feb 23 04:46:43 localhost systemd-logind[759]: Session 65 logged out. Waiting for processes to exit. Feb 23 04:46:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:46:43 localhost systemd-logind[759]: Removed session 65. 
Feb 23 04:46:43 localhost podman[296345]: 2026-02-23 09:46:43.553994643 +0000 UTC m=+0.094672003 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, version=9.7, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, architecture=x86_64, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:46:43 localhost podman[296345]: 2026-02-23 09:46:43.56639855 +0000 UTC m=+0.107075890 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red 
Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:46:43 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:46:44 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:44 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:44 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)... 
Feb 23 04:46:44 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:46:44 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:46:44 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:44 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:44 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:44 localhost nova_compute[282206]: 2026-02-23 09:46:44.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:44 localhost nova_compute[282206]: 2026-02-23 09:46:44.752 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:44 localhost nova_compute[282206]: 2026-02-23 09:46:44.752 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:44 localhost nova_compute[282206]: 2026-02-23 09:46:44.752 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:44 localhost nova_compute[282206]: 2026-02-23 09:46:44.792 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:44 localhost nova_compute[282206]: 2026-02-23 09:46:44.793 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:45 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 
04:46:45 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)... Feb 23 04:46:45 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:46:45 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:46:45 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:46 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)... Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:46:46 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:46 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:46 
localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:46:46 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:46:47 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)... Feb 23 04:46:47 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:46:47 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:47 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:47 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:46:48.548 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:46:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:46:48.548 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:46:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:46:48.549 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:46:48 localhost ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)... Feb 23 04:46:48 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:46:48 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:48 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:48 localhost ceph-mon[294160]: from='mgr.24104 172.18.0.104:0/1738626939' entity='mgr.np0005626460.fyrady' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:46:48 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:48 localhost ceph-mon[294160]: from='mgr.24104 ' entity='mgr.np0005626460.fyrady' Feb 23 04:46:48 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 e83: 6 total, 6 up, 6 in Feb 23 04:46:49 localhost systemd[1]: session-67.scope: Deactivated successfully. Feb 23 04:46:49 localhost systemd[1]: session-67.scope: Consumed 7.147s CPU time. Feb 23 04:46:49 localhost systemd-logind[759]: Session 67 logged out. Waiting for processes to exit. Feb 23 04:46:49 localhost systemd-logind[759]: Removed session 67. Feb 23 04:46:49 localhost sshd[296382]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:46:49 localhost systemd-logind[759]: New session 68 of user ceph-admin. Feb 23 04:46:49 localhost systemd[1]: Started Session 68 of User ceph-admin. Feb 23 04:46:49 localhost ceph-mon[294160]: from='client.? 172.18.0.200:0/3611724471' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:46:49 localhost ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:46:49 localhost ceph-mon[294160]: Activating manager daemon np0005626465.hlpkwo Feb 23 04:46:49 localhost ceph-mon[294160]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:46:49 localhost ceph-mon[294160]: Manager daemon np0005626465.hlpkwo is now available Feb 23 04:46:49 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch Feb 23 04:46:49 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch Feb 23 04:46:49 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch Feb 23 04:46:49 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch Feb 23 04:46:49 localhost nova_compute[282206]: 2026-02-23 09:46:49.793 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:49 localhost nova_compute[282206]: 2026-02-23 09:46:49.796 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:49 localhost nova_compute[282206]: 2026-02-23 09:46:49.797 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:49 localhost nova_compute[282206]: 2026-02-23 09:46:49.797 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:49 localhost 
nova_compute[282206]: 2026-02-23 09:46:49.847 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:49 localhost nova_compute[282206]: 2026-02-23 09:46:49.848 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:50 localhost podman[296490]: 2026-02-23 09:46:50.494316251 +0000 UTC m=+0.091537906 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:46:50 localhost podman[296490]: 2026-02-23 09:46:50.627481424 +0000 UTC m=+0.224703089 container exec_died 
fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container) Feb 23 04:46:50 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Feb 23 04:46:50 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:50.973543) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:46:50 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Feb 23 04:46:50 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840010973606, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1523, "num_deletes": 255, "total_data_size": 8165950, "memory_usage": 8638144, "flush_reason": "Manual Compaction"} Feb 23 04:46:50 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011008524, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5001842, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10912, "largest_seqno": 12430, "table_properties": {"data_size": 4994876, "index_size": 3855, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17591, "raw_average_key_size": 21, "raw_value_size": 4979849, "raw_average_value_size": 6147, "num_data_blocks": 160, "num_entries": 810, "num_filter_entries": 810, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839985, "oldest_key_time": 1771839985, "file_creation_time": 1771840010, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 35055 microseconds, and 10182 cpu microseconds. Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.008600) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5001842 bytes OK Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.008628) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.011159) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.011184) EVENT_LOG_v1 {"time_micros": 1771840011011177, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.011208) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 8157902, prev total WAL file size 
8166459, number of live WAL files 2. Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.012806) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303231' seq:72057594037927935, type:22 .. '6B760031323734' seq:0, type:0; will stop at (end) Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(4884KB)], [15(12MB)] Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011012954, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 18025668, "oldest_snapshot_seqno": -1} Feb 23 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:46:51 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:46:51 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:51 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10291 keys, 16995268 bytes, temperature: kUnknown Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011168429, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16995268, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16935229, "index_size": 33250, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25733, "raw_key_size": 275322, "raw_average_key_size": 26, "raw_value_size": 16757620, "raw_average_value_size": 1628, "num_data_blocks": 1263, "num_entries": 10291, "num_filter_entries": 10291, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840011, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": 
"4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.168745) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16995268 bytes Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.170699) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 115.9 rd, 109.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 12.4 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(7.0) write-amplify(3.4) OK, records in: 10813, records dropped: 522 output_compression: NoCompression Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.170718) EVENT_LOG_v1 {"time_micros": 1771840011170709, "job": 6, "event": "compaction_finished", "compaction_time_micros": 155573, "compaction_time_cpu_micros": 38908, "output_level": 6, "num_output_files": 1, "total_output_size": 16995268, "num_input_records": 10813, "num_output_records": 10291, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011171570, "job": 6, "event": "table_file_deletion", "file_number": 17} Feb 23 04:46:51 localhost 
ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840011173093, "job": 6, "event": "table_file_deletion", "file_number": 15} Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.012657) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173152) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173155) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:51 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:46:51.173158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:46:51 localhost podman[296581]: 2026-02-23 09:46:51.187144498 +0000 UTC m=+0.103943815 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:46:51 localhost podman[296581]: 2026-02-23 09:46:51.200498653 +0000 UTC m=+0.117297970 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:46:51 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:46:51 localhost podman[296618]: 2026-02-23 09:46:51.291134432 +0000 UTC m=+0.085950477 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:46:51 localhost podman[296618]: 2026-02-23 09:46:51.347350643 +0000 UTC m=+0.142166688 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:46:51 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:46:53 localhost ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Bus STARTING Feb 23 04:46:53 localhost ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Serving on https://172.18.0.107:7150 Feb 23 04:46:53 localhost ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Client ('172.18.0.107', 34506) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:46:53 localhost ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Serving on http://172.18.0.107:8765 Feb 23 04:46:53 localhost ceph-mon[294160]: [23/Feb/2026:09:46:50] ENGINE Bus STARTED Feb 23 04:46:53 localhost ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Feb 23 04:46:53 localhost ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Feb 23 04:46:53 localhost ceph-mon[294160]: Cluster is now healthy Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:53 localhost systemd[1]: Stopping User Manager for UID 1003... Feb 23 04:46:53 localhost systemd[292385]: Activating special unit Exit the Session... Feb 23 04:46:53 localhost systemd[292385]: Stopped target Main User Target. 
Feb 23 04:46:53 localhost systemd[292385]: Stopped target Basic System. Feb 23 04:46:53 localhost systemd[292385]: Stopped target Paths. Feb 23 04:46:53 localhost systemd[292385]: Stopped target Sockets. Feb 23 04:46:53 localhost systemd[292385]: Stopped target Timers. Feb 23 04:46:53 localhost systemd[292385]: Stopped Mark boot as successful after the user session has run 2 minutes. Feb 23 04:46:53 localhost systemd[292385]: Stopped Daily Cleanup of User's Temporary Directories. Feb 23 04:46:53 localhost systemd[292385]: Closed D-Bus User Message Bus Socket. Feb 23 04:46:53 localhost systemd[292385]: Stopped Create User's Volatile Files and Directories. Feb 23 04:46:53 localhost systemd[292385]: Removed slice User Application Slice. Feb 23 04:46:53 localhost systemd[292385]: Reached target Shutdown. Feb 23 04:46:53 localhost systemd[292385]: Finished Exit the Session. Feb 23 04:46:53 localhost systemd[292385]: Reached target Exit the Session. Feb 23 04:46:53 localhost systemd[1]: user@1003.service: Deactivated successfully. Feb 23 04:46:53 localhost systemd[1]: Stopped User Manager for UID 1003. Feb 23 04:46:53 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Feb 23 04:46:53 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Feb 23 04:46:53 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Feb 23 04:46:53 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Feb 23 04:46:53 localhost systemd[1]: Removed slice User Slice of UID 1003. Feb 23 04:46:53 localhost systemd[1]: user-1003.slice: Consumed 2.336s CPU time. 
Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626460", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", 
"name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:46:54 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:46:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:46:54 localhost podman[297014]: 2026-02-23 09:46:54.525722437 +0000 UTC m=+0.074538509 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute) Feb 23 04:46:54 localhost podman[297014]: 2026-02-23 09:46:54.563185848 +0000 UTC m=+0.112001910 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:46:54 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:46:54 localhost nova_compute[282206]: 2026-02-23 09:46:54.848 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:54 localhost nova_compute[282206]: 2026-02-23 09:46:54.851 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:54 localhost nova_compute[282206]: 2026-02-23 09:46:54.851 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:54 localhost nova_compute[282206]: 2026-02-23 09:46:54.852 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:54 localhost nova_compute[282206]: 2026-02-23 09:46:54.879 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:54 localhost nova_compute[282206]: 2026-02-23 09:46:54.879 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:55 localhost 
ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:46:55 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:46:55 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:46:55 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:46:55 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:46:55 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:55 localhost ceph-mon[294160]: Updating 
np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:46:56 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.144 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '50f6a5be-6717-43be-99c5-5d4ccece8b90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.135243', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971db5a0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '32f1929f900d4aea43a62c479d5a5028f8af349febca1f926653e0ffa498df5d'}]}, 'timestamp': '2026-02-23 09:46:56.144828', '_unique_id': 'f402c5c58aa441398695826817720c8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.146 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.147 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7afda6ea-bf69-4f7a-bb24-1ac48eb7e4e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.147810', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971e40d8-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '817a2d413328e026eccd4cc22afd0b1f63dd73010c37aeed7149f41e94f6cde9'}]}, 'timestamp': '2026-02-23 09:46:56.148335', '_unique_id': 'dee426847d754399b8157bd121d3b41f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.149 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.150 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c87ace0-8350-4f5a-b336-cb99c696cbe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.150782', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971eb48c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '49f55398390563c331f7215bccd333c40c4f1422c97541d69a3d1ecb2ca50db4'}]}, 'timestamp': '2026-02-23 09:46:56.151293', '_unique_id': 'bf8d63f508c648fa8cbbc20b65fe98db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.152 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.153 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.153 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '130d72d6-8ca4-4459-982b-ddf8e6464072', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.153467', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '971f1bd4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '08df1b73a50b7d7f0f5947a1ac1ea3baacf9547b29ad625a0344a260b5513a9d'}]}, 'timestamp': '2026-02-23 09:46:56.154050', '_unique_id': '3c7e245b36db45bebfb88d358c922049'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:46:56
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.155 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.188 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2a87d629-b26d-4c9d-bd9d-3cf94b97aa71', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.156851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '97247f8e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'c60245ed4a15df92172704893f3c34b63df78aa94b426b511fe586539d53d87a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.156851', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '97249424-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'fbfd08862eaaa809c8ee7260dbe63af02128df8fc4308f5267f8d4a619bf012c'}]}, 'timestamp': '2026-02-23 09:46:56.189774', '_unique_id': '4f0ab4a436d94f73ba01ea1a6a391697'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.193 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c41c68c0-fbd5-4193-a945-f4dacf831605', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.193051', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '972528b2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'b67fbaba54ed604ad87e6e21746330a6c24658b47652865d25d649c1a40d0f65'}]}, 'timestamp': '2026-02-23 09:46:56.193682', '_unique_id': 'b979d4bfd7194ec0aa228fe7ea015995'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.196 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3f99d46-0645-47bc-b50e-fcbbff405822', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.196449', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '9725ab98-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '2bf74b9ef3e0dffec9d3da48446269b01652b43f0f664765a11699ab4b362618'}]}, 'timestamp': '2026-02-23 09:46:56.197009', '_unique_id': 'e2c05bcb21b8449c881f8edb7fd779ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.197 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e525eb79-3459-4112-a2de-2f99eec15dcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.199278', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '972619f2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'f098cbd53e94351a71422e088f33773ea0762d113a4c4b8a51d077873dd24357'}]}, 'timestamp': '2026-02-23 09:46:56.199766', '_unique_id': 'c803b1a057c34fd48b7ade36c6adef90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.201 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.202 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef2f0845-bc0e-4265-853c-150b72a8b49d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:46:56.202229', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '97292b42-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.408451054, 'message_signature': '0d2021a7a4ff6615072b1996dc7ac21e40c64008886ac04cda2bd2a97a3de307'}]}, 'timestamp': '2026-02-23 09:46:56.219963', '_unique_id': '20b89140c85341fdad567e0b7caedf7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3e259485-a02b-4f77-bb1d-bfe28bafa3bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.223022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9729ba3a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '0aa7afac3bf008618a70d6b83f5346f46b2e4770104ca9e5a1c365331e8914b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.223022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9729ccbe-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '3d75ee7651d95319f974d25bd12fe40e7534dbabcf3d15afbc377c925b014548'}]}, 'timestamp': '2026-02-23 09:46:56.224040', '_unique_id': '000b6d0fbb8e4c5a8b94b217b5af3684'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '11bad915-f399-4537-b46b-a5a13026bf05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.226360', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '972a3c6c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'c0c8d224ed1376f72983b07656dc4430c75d594dc35e4d7783b0cf192c0a23a5'}]}, 'timestamp': '2026-02-23 09:46:56.226859', '_unique_id': '29c23c526fed4282987afaeaf49b5909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.227 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'deb476a7-2403-4782-ad84-c3cb23feee35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.229463', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972ab8e0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'ed4c1510ae0720b419b8c623afd3d9e98738ed8f4b619b1bab4a58be329e7d15'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.229463', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972ad4e2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'a7ae7cdad8370c765d34adae1f9b2449299c4e21b11b7f2c56eab468cfa10664'}]}, 'timestamp': '2026-02-23 09:46:56.230805', '_unique_id': '938fec49ee2c4015b76f024f73322a31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0560c791-c276-4fde-b8bb-389bdf4796db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.234385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972b77ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'ac3b900e4df4a2561f9182b9550bac38f5beb64a8fc00dd31222affef44baa08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.234385', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972b8c0c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': 'e588c37614ba5678030da1f150fb822f6d48b5901e07a15c57474166ea640dbd'}]}, 'timestamp': '2026-02-23 09:46:56.235431', '_unique_id': 'b9f1734974de4b4796341244602bab96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ac21f0c-818c-4429-8225-8b9ded36df9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.238181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972c11a4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '3167be254df7d66328cc07ac4184b53a68920976f2d80d882a46640d21343e18'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.238181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972c2af4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '094795e4dc6b1001a1fe44e2d6e5fd88ecc522d51424e79ec593c15f647c2086'}]}, 'timestamp': '2026-02-23 09:46:56.239569', '_unique_id': '6d4af4d0b0f543f5a5ea11230ec104de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d7f1246-0795-4b32-972c-b11f3c7dac82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.242189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972e2fc0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': '91946ebf0a00c888fd005c5a944f32215d1fdda8856a67a128ea251b749acf76'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 
'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.242189', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972e43b6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': '8da4676c95cb48ff037cc1458874c715ea21565d63d0bc6ae6e362cfc5ef5282'}]}, 'timestamp': '2026-02-23 09:46:56.253250', '_unique_id': '1f475e546ffb4b9cba144bec0183a89d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.254 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.256 12 DEBUG 
ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4de55d29-983b-4880-9bf4-5a651ebbb937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.255633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972eb468-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': 'c60f9fdcf203eff2df5745dabb324c7a20fe2d190256e2653f77754da4b2e61e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.255633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972ecbb0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': '285e0014d9dc2d155728e923af62f2c37253e5fde9c0e533021f21eec41eea37'}]}, 'timestamp': '2026-02-23 09:46:56.256709', '_unique_id': '00cf6494071d4c6aa076b9e923258916'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:46:56.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 10890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee662f1b-7a04-4380-aa82-7c95446b7810', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10890000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:46:56.259247', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '972f4040-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.408451054, 'message_signature': '456f70bcf4c26d25f381f9b4452bfc880f446ddfe7fb60290ccc488b59f20c10'}]}, 'timestamp': '2026-02-23 09:46:56.259708', '_unique_id': '844c38cae9fe43f1bae020364771735e'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bf0e9db4-8810-4000-9610-599a2914ce9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.262022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '972fac92-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '3c26ae9337b9e03afa22ef13fc4df14b15cd580f84af52ab68e7df0f9bff11a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.262022', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '972fbebc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.346406506, 'message_signature': '9d109338901a97533fd29e5949364a5a54eebdb439facbf05ba3d8c932e3ac5e'}]}, 'timestamp': '2026-02-23 09:46:56.262983', '_unique_id': '54bd4531a4e249f7a8bb48a11a062650'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:46:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '018be324-7fd1-4161-b80d-728d6a1885fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.265411', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '973030ea-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': '470894224921996d5c1b155f6c58765098121eb09103d46915e622a45489f78e'}]}, 'timestamp': '2026-02-23 09:46:56.265930', '_unique_id': '58cbb17171f84b1b92322e3743027f8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.266 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.268 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45442e7c-3697-4d37-a5ef-796e473e062d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:46:56.268144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '97309b3e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': 'dd397b966d9036ca9dfdf723a97985611892d21772da08121f5d0b0efd6e8a0c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:46:56.268144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9730ab92-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.431705621, 'message_signature': 'a4792ee911e37f8fbf641fa38313229fc94d575c8f5f016b238f1ce4d4cf5897'}]}, 'timestamp': '2026-02-23 09:46:56.269062', '_unique_id': 'b2347a7fd7b34a508fccf5ef4b4845be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.270 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.271 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fa08527-a3bc-40ad-b9a5-ab1d9e49be59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:46:56.271333', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '9731180c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11456.324725675, 'message_signature': 'b32890d6dd8e2ab8368c64732e9aa9fa4f73e7b1fbfba5b23afb2c8688ff2aed'}]}, 'timestamp': '2026-02-23 09:46:56.271799', '_unique_id': '128c08b855b94a0b9380170e4403c402'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:46:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:46:56.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:46:56 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:56 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:56 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:56 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:56 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 '
entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:57 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:46:57 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:46:57 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:46:57 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:46:57 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:46:57 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:57 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:57 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:57 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:57 localhost ceph-mon[294160]: Reconfiguring mon.np0005626460 (monmap changed)... 
Feb 23 04:46:57 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:57 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626460 on np0005626460.localdomain Feb 23 04:46:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:46:57 localhost systemd[1]: tmp-crun.YaecEV.mount: Deactivated successfully. Feb 23 04:46:57 localhost podman[297459]: 2026-02-23 09:46:57.924181351 +0000 UTC m=+0.088500265 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible) Feb 23 04:46:57 localhost podman[297459]: 2026-02-23 09:46:57.936252248 +0000 UTC m=+0.100571162 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:46:57 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:46:58 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:58 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:58 localhost ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:46:58 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:58 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:46:58 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:59 localhost podman[297531]: Feb 23 04:46:59 localhost podman[297531]: 2026-02-23 09:46:59.226499726 +0000 UTC m=+0.046053072 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:46:59 localhost podman[297531]: 2026-02-23 09:46:59.76315338 +0000 UTC m=+0.582706676 container create ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, version=7, 
GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.42.2, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main) Feb 23 04:46:59 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:46:59 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:46:59 localhost systemd[1]: Started libpod-conmon-ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599.scope. Feb 23 04:46:59 localhost systemd[1]: Started libcrun container. 
Feb 23 04:46:59 localhost podman[297531]: 2026-02-23 09:46:59.826033773 +0000 UTC m=+0.645587089 container init ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.buildah.version=1.42.2, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:46:59 localhost podman[297531]: 2026-02-23 09:46:59.839694259 +0000 UTC m=+0.659247585 container start ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, 
io.buildah.version=1.42.2, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:46:59 localhost podman[297531]: 2026-02-23 09:46:59.840180863 +0000 UTC m=+0.659734179 container attach ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.42.2, 
url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git) Feb 23 04:46:59 localhost goofy_dirac[297544]: 167 167 Feb 23 04:46:59 localhost systemd[1]: libpod-ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599.scope: Deactivated successfully. Feb 23 04:46:59 localhost podman[297531]: 2026-02-23 09:46:59.843559407 +0000 UTC m=+0.663112783 container died ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.42.2, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:46:59 localhost nova_compute[282206]: 2026-02-23 09:46:59.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:59 localhost nova_compute[282206]: 2026-02-23 09:46:59.883 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:46:59 localhost nova_compute[282206]: 2026-02-23 09:46:59.883 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:46:59 localhost nova_compute[282206]: 2026-02-23 09:46:59.883 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:59 localhost nova_compute[282206]: 2026-02-23 09:46:59.917 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:46:59 localhost nova_compute[282206]: 2026-02-23 09:46:59.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:46:59 localhost podman[297549]: 2026-02-23 09:46:59.975828552 +0000 UTC m=+0.121765717 container remove ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_dirac, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, distribution-scope=public, vcs-type=git, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, 
name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:46:59 localhost systemd[1]: libpod-conmon-ea0065d3ce8bcefdd2e57f9fbd4fc69b8d4474babdbc2ef48efe6bd405c04599.scope: Deactivated successfully. Feb 23 04:47:00 localhost systemd[1]: var-lib-containers-storage-overlay-5e1f596d40901f80883bc707021c73cb009df1f5f19e7aed28a829b48a673fdd-merged.mount: Deactivated successfully. Feb 23 04:47:01 localhost ceph-mon[294160]: mon.np0005626463@4(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:01 localhost ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)... 
Feb 23 04:47:01 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:47:01 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:01 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:01 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:01 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:47:01 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:04 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5080 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0 Feb 23 04:47:04 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Feb 23 04:47:04 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0 Feb 23 04:47:04 localhost ceph-mon[294160]: mon.np0005626463@4(peon) e10 my rank is now 3 (was 4) Feb 23 04:47:04 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e51e0 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0 Feb 23 04:47:04 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:47:04 localhost ceph-mon[294160]: paxos.3).electionLogic(42) init, last seen epoch 42 Feb 23 04:47:04 localhost ceph-mon[294160]: mon.np0005626463@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:47:04 localhost ceph-mon[294160]: mon.np0005626463@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:47:04 localhost nova_compute[282206]: 2026-02-23 09:47:04.919 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:04 localhost nova_compute[282206]: 2026-02-23 09:47:04.921 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:04 localhost nova_compute[282206]: 2026-02-23 09:47:04.921 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:04 localhost nova_compute[282206]: 2026-02-23 09:47:04.922 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:04 localhost nova_compute[282206]: 2026-02-23 09:47:04.972 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:04 localhost nova_compute[282206]: 2026-02-23 09:47:04.972 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626463@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626463@3(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:47:06 localhost ceph-mon[294160]: Remove daemons mon.np0005626460 Feb 23 04:47:06 localhost ceph-mon[294160]: Safe to remove mon.np0005626460: new quorum should be ['np0005626461', 'np0005626466', 'np0005626465', 'np0005626463'] (from ['np0005626461', 'np0005626466', 'np0005626465', 'np0005626463']) Feb 23 04:47:06 localhost ceph-mon[294160]: Removing monitor np0005626460 from monmap... 
Feb 23 04:47:06 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "mon rm", "name": "np0005626460"} : dispatch Feb 23 04:47:06 localhost ceph-mon[294160]: Removing daemon mon.np0005626460 from np0005626460.localdomain -- ports [] Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election Feb 23 04:47:06 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466,np0005626465,np0005626463 in quorum (ranks 0,1,2,3) Feb 23 04:47:06 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:47:06 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 04:47:06 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:06 localhost sshd[297658]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/etc/ceph/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating 
np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626460.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:47:07 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mds[286877]: mds.beacon.mds.np0005626463.qcthuc missed beacon ack from the monitors Feb 23 04:47:08 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:47:08 localhost ceph-mon[294160]: Removed label mon from host np0005626460.localdomain Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' 
entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626460", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:09 localhost podman[242954]: time="2026-02-23T09:47:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:47:09 localhost podman[242954]: @ - - [23/Feb/2026:09:47:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:47:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:47:09 localhost podman[242954]: @ - - [23/Feb/2026:09:47:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18265 "" "Go-http-client/1.1" Feb 23 04:47:09 localhost podman[297922]: 2026-02-23 09:47:09.528686393 +0000 UTC m=+0.086010919 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:47:09 localhost podman[297922]: 2026-02-23 09:47:09.543318888 +0000 UTC m=+0.100643404 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:47:09 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:47:09 localhost ceph-mon[294160]: Reconfiguring crash.np0005626460 (monmap changed)... 
Feb 23 04:47:09 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626460 on np0005626460.localdomain Feb 23 04:47:09 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:09 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:09 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:09 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:09 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626460.fyrady", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:09 localhost nova_compute[282206]: 2026-02-23 09:47:09.974 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:09 localhost nova_compute[282206]: 2026-02-23 09:47:09.975 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:09 localhost nova_compute[282206]: 2026-02-23 09:47:09.976 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:09 localhost nova_compute[282206]: 2026-02-23 09:47:09.976 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:10 localhost nova_compute[282206]: 2026-02-23 09:47:10.020 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:10 localhost nova_compute[282206]: 2026-02-23 09:47:10.021 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0. Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.950973) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19 Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840030951032, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 1161, "num_deletes": 261, "total_data_size": 2570671, "memory_usage": 2632496, "flush_reason": "Manual Compaction"} Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840030963319, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 1548075, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12435, "largest_seqno": 13591, "table_properties": {"data_size": 1542467, "index_size": 2823, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14474, "raw_average_key_size": 21, "raw_value_size": 1530230, "raw_average_value_size": 2290, "num_data_blocks": 119, "num_entries": 668, 
"num_filter_entries": 668, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840011, "oldest_key_time": 1771840011, "file_creation_time": 1771840030, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}} Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 12400 microseconds, and 3235 cpu microseconds. Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.963377) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 1548075 bytes OK Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.963404) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.967328) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.967356) EVENT_LOG_v1 {"time_micros": 1771840030967349, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.967379) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 2564455, prev total WAL file size 2564779, number of live WAL files 2. Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.971305) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353133' seq:72057594037927935, type:22 .. 
'6C6F676D0033373639' seq:0, type:0; will stop at (end) Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(1511KB)], [18(16MB)] Feb 23 04:47:10 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840030971359, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18543343, "oldest_snapshot_seqno": -1} Feb 23 04:47:11 localhost ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10402 keys, 18390887 bytes, temperature: kUnknown Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840031123657, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 18390887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18328410, "index_size": 35368, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 279469, "raw_average_key_size": 26, "raw_value_size": 18147236, "raw_average_value_size": 1744, "num_data_blocks": 1353, "num_entries": 10402, "num_filter_entries": 10402, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", 
"merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840030, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}} Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.124082) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 18390887 bytes Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.128356) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 121.6 rd, 120.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 16.2 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(23.9) write-amplify(11.9) OK, records in: 10959, records dropped: 557 output_compression: NoCompression Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.128388) EVENT_LOG_v1 {"time_micros": 1771840031128374, "job": 8, "event": "compaction_finished", "compaction_time_micros": 152446, "compaction_time_cpu_micros": 51845, "output_level": 6, "num_output_files": 1, "total_output_size": 18390887, "num_input_records": 10959, "num_output_records": 10402, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, 
"num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840031128924, "job": 8, "event": "table_file_deletion", "file_number": 20} Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840031131506, "job": 8, "event": "table_file_deletion", "file_number": 18} Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:10.971199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131641) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131647) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131653) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:11.131656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:11 localhost 
ceph-mon[294160]: Removed label mgr from host np0005626460.localdomain Feb 23 04:47:11 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626460.fyrady (monmap changed)... Feb 23 04:47:11 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626460.fyrady on np0005626460.localdomain Feb 23 04:47:11 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:11 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:11 localhost ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:47:11 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:47:11 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:47:11 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:12 localhost nova_compute[282206]: 2026-02-23 09:47:12.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:12 localhost nova_compute[282206]: 2026-02-23 09:47:12.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:47:12 localhost nova_compute[282206]: 2026-02-23 09:47:12.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:47:12 localhost ceph-mon[294160]: Removed label _admin from host np0005626460.localdomain Feb 23 04:47:12 
localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:12 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:13 localhost nova_compute[282206]: 2026-02-23 09:47:13.003 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:47:13 localhost nova_compute[282206]: 2026-02-23 09:47:13.004 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock 
"refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:47:13 localhost nova_compute[282206]: 2026-02-23 09:47:13.004 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:47:13 localhost nova_compute[282206]: 2026-02-23 09:47:13.004 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:47:13 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... Feb 23 04:47:13 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:47:13 localhost ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)... Feb 23 04:47:13 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:47:13 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:13 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:13 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:47:13 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:13 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:13 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:47:13 localhost openstack_network_exporter[245358]: ERROR 09:47:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:47:13 localhost openstack_network_exporter[245358]: Feb 23 04:47:13 localhost openstack_network_exporter[245358]: ERROR 09:47:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:47:13 localhost openstack_network_exporter[245358]: Feb 23 04:47:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:47:13 localhost podman[297996]: 2026-02-23 09:47:13.833365335 +0000 UTC m=+0.085273087 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., 
managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=) Feb 23 04:47:13 localhost podman[297996]: 2026-02-23 09:47:13.84830854 +0000 UTC m=+0.100216262 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, architecture=x86_64, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': 
['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Feb 23 04:47:13 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:47:13 localhost podman[298004]: Feb 23 04:47:13 localhost podman[298004]: 2026-02-23 09:47:13.93342385 +0000 UTC m=+0.163879639 container create 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, version=7, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64) Feb 23 04:47:13 localhost systemd[1]: Started libpod-conmon-4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4.scope. 
Feb 23 04:47:14 localhost podman[298004]: 2026-02-23 09:47:13.907777089 +0000 UTC m=+0.138232868 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:14 localhost nova_compute[282206]: 2026-02-23 09:47:14.006 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:47:14 localhost systemd[1]: Started libcrun container. 
Feb 23 04:47:14 localhost podman[298004]: 2026-02-23 09:47:14.02478019 +0000 UTC m=+0.255235969 container init 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, release=1770267347, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.openshift.expose-services=, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main) Feb 23 04:47:14 localhost nova_compute[282206]: 2026-02-23 09:47:14.034 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:47:14 localhost nova_compute[282206]: 2026-02-23 09:47:14.034 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:47:14 localhost cool_rhodes[298035]: 167 167 Feb 23 04:47:14 localhost systemd[1]: libpod-4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4.scope: Deactivated successfully. Feb 23 04:47:14 localhost podman[298004]: 2026-02-23 09:47:14.03823484 +0000 UTC m=+0.268690629 container start 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True) Feb 23 04:47:14 localhost podman[298004]: 2026-02-23 09:47:14.038484058 +0000 UTC m=+0.268939857 container attach 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vendor=Red Hat, Inc., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:47:14 localhost podman[298004]: 2026-02-23 09:47:14.040646844 +0000 UTC m=+0.271102623 container died 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, 
GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64) Feb 23 04:47:14 localhost nova_compute[282206]: 2026-02-23 09:47:14.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:14 localhost nova_compute[282206]: 2026-02-23 09:47:14.073 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:14 localhost podman[298040]: 2026-02-23 09:47:14.132129377 +0000 UTC m=+0.084934455 container remove 4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_rhodes, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, release=1770267347, io.buildah.version=1.42.2, architecture=x86_64, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux ) Feb 23 04:47:14 localhost systemd[1]: libpod-conmon-4b5da48da9cda6c7a0e01fc47e88d86c55405112aa3b15f082ef48e58c1fa3c4.scope: Deactivated successfully. Feb 23 04:47:14 localhost podman[298108]: Feb 23 04:47:14 localhost podman[298108]: 2026-02-23 09:47:14.804536392 +0000 UTC m=+0.074447526 container create 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, distribution-scope=public, io.k8s.display-name=Red 
Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:47:14 localhost systemd[1]: var-lib-containers-storage-overlay-8cffeeb07020634fd5a1dcbf109b156005d2467919d0716ab65955ebeb38f74e-merged.mount: Deactivated successfully. Feb 23 04:47:14 localhost systemd[1]: Started libpod-conmon-6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5.scope. Feb 23 04:47:14 localhost systemd[1]: Started libcrun container. Feb 23 04:47:14 localhost podman[298108]: 2026-02-23 09:47:14.775608902 +0000 UTC m=+0.045520076 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:14 localhost podman[298108]: 2026-02-23 09:47:14.883703891 +0000 UTC m=+0.153615015 container init 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, 
GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, build-date=2026-02-09T10:25:24Z) Feb 23 04:47:14 localhost vigorous_carver[298124]: 167 167 Feb 23 04:47:14 localhost podman[298108]: 2026-02-23 09:47:14.895572023 +0000 UTC m=+0.165483157 container start 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, version=7) Feb 23 04:47:14 localhost podman[298108]: 2026-02-23 09:47:14.895965155 +0000 UTC m=+0.165876319 container attach 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, ceph=True, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, RELEASE=main) Feb 23 04:47:14 localhost systemd[1]: libpod-6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5.scope: Deactivated successfully. 
Feb 23 04:47:14 localhost podman[298108]: 2026-02-23 09:47:14.900752151 +0000 UTC m=+0.170663285 container died 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:47:15 localhost podman[298129]: 2026-02-23 09:47:15.050625402 +0000 UTC m=+0.144209150 container remove 6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_carver, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.42.2, GIT_CLEAN=True, name=rhceph, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=) Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.057 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.057 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:15 localhost systemd[1]: libpod-conmon-6a1e9f239e7881ef12008136f3a9816cb429ef48c9dd8ec386b913da44593aa5.scope: Deactivated successfully. Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:15 localhost nova_compute[282206]: 2026-02-23 09:47:15.060 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:15 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:15 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:15 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... Feb 23 04:47:15 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:47:15 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:47:15 localhost systemd[1]: var-lib-containers-storage-overlay-4eee731cae5fdc18be0c10d2891fe31ef65d4d7fe265dcce23a01754e5da6e90-merged.mount: Deactivated successfully. 
Feb 23 04:47:15 localhost podman[298206]: Feb 23 04:47:15 localhost podman[298206]: 2026-02-23 09:47:15.896949739 +0000 UTC m=+0.075854139 container create c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , version=7, CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.42.2, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph) Feb 23 04:47:15 localhost systemd[1]: Started libpod-conmon-c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202.scope. Feb 23 04:47:15 localhost systemd[1]: Started libcrun container. 
Feb 23 04:47:15 localhost podman[298206]: 2026-02-23 09:47:15.866388119 +0000 UTC m=+0.045292569 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:15 localhost podman[298206]: 2026-02-23 09:47:15.96729052 +0000 UTC m=+0.146194930 container init c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True) Feb 23 04:47:15 localhost podman[298206]: 2026-02-23 09:47:15.977771689 +0000 UTC m=+0.156676099 container start c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, GIT_BRANCH=main, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Feb 23 04:47:15 localhost clever_kare[298221]: 167 167 Feb 23 04:47:15 localhost systemd[1]: libpod-c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202.scope: Deactivated successfully. 
Feb 23 04:47:15 localhost podman[298206]: 2026-02-23 09:47:15.979404089 +0000 UTC m=+0.158308489 container attach c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True) Feb 23 04:47:15 localhost podman[298206]: 2026-02-23 09:47:15.984419531 +0000 UTC m=+0.163323931 container died c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, distribution-scope=public, 
io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.42.2) Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:16 localhost podman[298226]: 2026-02-23 09:47:16.074100461 +0000 UTC m=+0.080924933 container remove c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=clever_kare, io.buildah.version=1.42.2, version=7, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:47:16 localhost 
nova_compute[282206]: 2026-02-23 09:47:16.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.079 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:47:16 localhost systemd[1]: libpod-conmon-c3d69c241f38bd730468dd9eb786baa4991cf5be4edfe2c6c480b49333804202.scope: Deactivated successfully. 
Feb 23 04:47:16 localhost ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:16 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:16 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:16 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:47:16 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.511 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.587 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.588 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.796 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.798 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11807MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.799 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.799 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:47:16 localhost systemd[1]: var-lib-containers-storage-overlay-f2d55bfe64961b6a0033d5af37241c0299c9a0526d0cd67a6752868fa1a936d9-merged.mount: Deactivated successfully. Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.876 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:47:16 localhost nova_compute[282206]: 2026-02-23 09:47:16.925 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:47:16 localhost podman[298324]: Feb 23 04:47:16 localhost podman[298324]: 2026-02-23 09:47:16.941136239 +0000 UTC m=+0.079466849 container create 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, distribution-scope=public) Feb 23 04:47:16 localhost systemd[1]: Started libpod-conmon-9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77.scope. Feb 23 04:47:16 localhost systemd[1]: Started libcrun container. Feb 23 04:47:17 localhost podman[298324]: 2026-02-23 09:47:17.007673054 +0000 UTC m=+0.146003664 container init 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, release=1770267347, 
GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, name=rhceph) Feb 23 04:47:17 localhost podman[298324]: 2026-02-23 09:47:16.910370463 +0000 UTC m=+0.048701113 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:17 localhost podman[298324]: 2026-02-23 09:47:17.023783114 +0000 UTC m=+0.162113724 container start 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , name=rhceph, CEPH_POINT_RELEASE=, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:47:17 localhost podman[298324]: 2026-02-23 09:47:17.024031381 +0000 UTC m=+0.162362041 container attach 
9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, architecture=x86_64, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux ) Feb 23 04:47:17 localhost kind_mestorf[298339]: 167 167 Feb 23 04:47:17 localhost systemd[1]: libpod-9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77.scope: Deactivated successfully. 
Feb 23 04:47:17 localhost podman[298324]: 2026-02-23 09:47:17.02723453 +0000 UTC m=+0.165565160 container died 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-type=git, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1770267347, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:47:17 localhost podman[298344]: 2026-02-23 09:47:17.10973357 +0000 UTC m=+0.069235558 container remove 9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mestorf, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, release=1770267347, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public) Feb 23 04:47:17 localhost systemd[1]: libpod-conmon-9e58a899972f32947268d546395271db818a1fea4b277e859a7625b8ef46ca77.scope: Deactivated successfully. Feb 23 04:47:17 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)... Feb 23 04:47:17 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:17 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... 
Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:17 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:17 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)... Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:17 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:17 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:47:17 localhost ceph-mon[294160]: mon.np0005626463@3(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:47:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/762953382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:47:17 localhost nova_compute[282206]: 2026-02-23 09:47:17.389 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:47:17 localhost nova_compute[282206]: 2026-02-23 09:47:17.397 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:47:17 localhost nova_compute[282206]: 2026-02-23 09:47:17.422 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:47:17 localhost nova_compute[282206]: 2026-02-23 09:47:17.424 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:47:17 localhost nova_compute[282206]: 2026-02-23 09:47:17.425 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.625s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:47:17 localhost podman[298433]: Feb 23 04:47:17 localhost systemd[1]: tmp-crun.IzyYO4.mount: Deactivated successfully. Feb 23 04:47:17 localhost systemd[1]: var-lib-containers-storage-overlay-919defabe99933a0a258bd21b4a2c3d2b7cdde0f5b23926e44268ee7cf387d34-merged.mount: Deactivated successfully. Feb 23 04:47:17 localhost podman[298433]: 2026-02-23 09:47:17.82962922 +0000 UTC m=+0.088718411 container create ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, version=7, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_BRANCH=main, io.buildah.version=1.42.2, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers) 
Feb 23 04:47:17 localhost systemd[1]: Started libpod-conmon-ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5.scope. Feb 23 04:47:17 localhost systemd[1]: Started libcrun container. Feb 23 04:47:17 localhost podman[298433]: 2026-02-23 09:47:17.788422526 +0000 UTC m=+0.047511747 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:17 localhost podman[298433]: 2026-02-23 09:47:17.90485609 +0000 UTC m=+0.163945321 container init ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph) Feb 23 04:47:17 localhost podman[298433]: 2026-02-23 09:47:17.912466512 +0000 UTC m=+0.171555713 container start ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=amazing_williams, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, distribution-scope=public, CEPH_POINT_RELEASE=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True) Feb 23 04:47:17 localhost podman[298433]: 2026-02-23 09:47:17.912947496 +0000 UTC m=+0.172036787 container attach ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, name=rhceph, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1770267347, RELEASE=main, com.redhat.component=rhceph-container) Feb 23 04:47:17 localhost amazing_williams[298448]: 167 167 Feb 23 04:47:17 localhost systemd[1]: libpod-ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5.scope: Deactivated successfully. Feb 23 04:47:17 localhost podman[298433]: 2026-02-23 09:47:17.916081131 +0000 UTC m=+0.175170332 container died ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, io.openshift.expose-services=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:47:18 localhost podman[298453]: 2026-02-23 09:47:18.019103177 +0000 UTC m=+0.090102513 container remove ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_williams, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Feb 23 04:47:18 localhost systemd[1]: libpod-conmon-ebcf31195723dadb5265b7dcc1b65000be597a9629273c58c556bc40b26372e5.scope: Deactivated successfully. 
Feb 23 04:47:18 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:18 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:18 localhost ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)... Feb 23 04:47:18 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:47:18 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:47:18 localhost nova_compute[282206]: 2026-02-23 09:47:18.424 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:18 localhost nova_compute[282206]: 2026-02-23 09:47:18.425 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:18 localhost nova_compute[282206]: 2026-02-23 09:47:18.425 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:47:18 localhost nova_compute[282206]: 2026-02-23 09:47:18.425 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:47:18 localhost podman[298523]: Feb 23 04:47:18 localhost podman[298523]: 2026-02-23 09:47:18.696322037 +0000 UTC m=+0.075211739 container create 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:47:18 localhost systemd[1]: Started libpod-conmon-4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932.scope. Feb 23 04:47:18 localhost systemd[1]: Started libcrun container. 
Feb 23 04:47:18 localhost podman[298523]: 2026-02-23 09:47:18.761945535 +0000 UTC m=+0.140835227 container init 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public) Feb 23 04:47:18 localhost podman[298523]: 2026-02-23 09:47:18.665047526 +0000 UTC m=+0.043937238 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:18 localhost podman[298523]: 2026-02-23 09:47:18.769440774 +0000 UTC m=+0.148330466 container start 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, maintainer=Guillaume Abrioux , ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, release=1770267347, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_CLEAN=True) Feb 23 04:47:18 localhost podman[298523]: 2026-02-23 09:47:18.76965533 +0000 UTC m=+0.148545042 container attach 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, ceph=True, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, vcs-type=git, io.openshift.expose-services=) Feb 23 04:47:18 localhost nifty_archimedes[298539]: 167 167 Feb 23 04:47:18 localhost systemd[1]: libpod-4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932.scope: Deactivated successfully. Feb 23 04:47:18 localhost podman[298523]: 2026-02-23 09:47:18.77163503 +0000 UTC m=+0.150524752 container died 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, ceph=True, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, release=1770267347, version=7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container) Feb 23 04:47:18 localhost systemd[1]: 
var-lib-containers-storage-overlay-ecaa1c383fa4bc26139d8ccc82e0cbd570b8b02a52542aecba0e731f2ceeddc0-merged.mount: Deactivated successfully. Feb 23 04:47:18 localhost systemd[1]: var-lib-containers-storage-overlay-048046cdb304d4b4b16d8ee31efbca59063103a54e3e2012616f105b0f3e6f4f-merged.mount: Deactivated successfully. Feb 23 04:47:18 localhost podman[298544]: 2026-02-23 09:47:18.886926898 +0000 UTC m=+0.103196491 container remove 4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, release=1770267347) Feb 23 04:47:18 localhost systemd[1]: libpod-conmon-4e260e59072219a7fc60e3a60394d6972bef3e4481460e7cd128be3a12b3f932.scope: Deactivated successfully. 
Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:19 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)... Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:19 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:19 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 23 04:47:20 localhost nova_compute[282206]: 2026-02-23 09:47:20.060 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:20 localhost nova_compute[282206]: 2026-02-23 09:47:20.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:20 localhost nova_compute[282206]: 2026-02-23 09:47:20.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:20 localhost nova_compute[282206]: 2026-02-23 
09:47:20.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:20 localhost nova_compute[282206]: 2026-02-23 09:47:20.088 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:20 localhost nova_compute[282206]: 2026-02-23 09:47:20.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:21 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)... Feb 23 04:47:21 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain Feb 23 04:47:21 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:21 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:21 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 23 04:47:21 localhost ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:47:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:47:21 localhost podman[298561]: 2026-02-23 09:47:21.912040107 +0000 UTC m=+0.085262916 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:47:21 localhost podman[298561]: 2026-02-23 09:47:21.956350715 +0000 UTC m=+0.129573534 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:47:21 localhost podman[298562]: 2026-02-23 09:47:21.957020005 +0000 UTC m=+0.127992425 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:47:21 localhost podman[298562]: 2026-02-23 09:47:21.971227088 +0000 UTC m=+0.142199528 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:47:21 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:47:22 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)... Feb 23 04:47:22 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain Feb 23 04:47:22 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:22 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:22 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:22 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:22 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.138075) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042138138, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 723, "num_deletes": 251, "total_data_size": 1044496, "memory_usage": 1057536, "flush_reason": "Manual Compaction"} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042146767, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 614951, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13596, "largest_seqno": 14314, "table_properties": {"data_size": 611089, "index_size": 1589, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10138, "raw_average_key_size": 21, "raw_value_size": 603016, "raw_average_value_size": 1288, "num_data_blocks": 66, "num_entries": 468, "num_filter_entries": 468, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840030, "oldest_key_time": 1771840030, "file_creation_time": 1771840042, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8738 microseconds, and 3242 cpu microseconds. Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.146814) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 614951 bytes OK Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.146838) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.148768) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.148791) EVENT_LOG_v1 {"time_micros": 1771840042148785, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.148811) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1040324, prev total WAL file size 
1042176, number of live WAL files 2. Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.150468) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end) Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(600KB)], [21(17MB)] Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042150540, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19005838, "oldest_snapshot_seqno": -1} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10346 keys, 15653784 bytes, temperature: kUnknown Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042243864, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15653784, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15593196, "index_size": 33607, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25925, "raw_key_size": 279211, "raw_average_key_size": 26, "raw_value_size": 15414506, 
"raw_average_value_size": 1489, "num_data_blocks": 1277, "num_entries": 10346, "num_filter_entries": 10346, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840042, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.244231) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15653784 bytes Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.267474) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.4 rd, 167.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 17.5 +0.0 blob) out(14.9 +0.0 blob), read-write-amplify(56.4) write-amplify(25.5) OK, records in: 10870, records dropped: 524 output_compression: NoCompression Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.267508) EVENT_LOG_v1 {"time_micros": 1771840042267491, "job": 10, "event": "compaction_finished", "compaction_time_micros": 93449, "compaction_time_cpu_micros": 43555, "output_level": 6, "num_output_files": 1, "total_output_size": 15653784, "num_input_records": 10870, "num_output_records": 10346, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042267921, "job": 10, "event": "table_file_deletion", "file_number": 23} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840042270691, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.150396) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:22 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:47:22.270823) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:47:22 localhost sshd[298608]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:47:23 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)... 
Feb 23 04:47:23 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain Feb 23 04:47:23 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:23 localhost ceph-mon[294160]: Added label _no_schedule to host np0005626460.localdomain Feb 23 04:47:23 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:23 localhost ceph-mon[294160]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626460.localdomain Feb 23 04:47:23 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:23 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:23 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:23 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:24 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)... Feb 23 04:47:24 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:47:24 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:24 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:24 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:47:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:47:24 localhost podman[298610]: 2026-02-23 09:47:24.907416961 +0000 UTC m=+0.081059689 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:47:24 localhost podman[298610]: 2026-02-23 09:47:24.946367335 +0000 UTC m=+0.120010083 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:47:24 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:47:25 localhost nova_compute[282206]: 2026-02-23 09:47:25.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:25 localhost nova_compute[282206]: 2026-02-23 09:47:25.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:25 localhost nova_compute[282206]: 2026-02-23 09:47:25.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:25 localhost nova_compute[282206]: 2026-02-23 09:47:25.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:25 localhost nova_compute[282206]: 2026-02-23 09:47:25.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:25 localhost nova_compute[282206]: 2026-02-23 09:47:25.122 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:25 localhost ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)... 
Feb 23 04:47:25 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain"} : dispatch
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain"} : dispatch
Feb 23 04:47:25 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain"}]': finished
Feb 23 04:47:26 localhost ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:47:26 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 04:47:26 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 04:47:26 localhost ceph-mon[294160]: Removed host np0005626460.localdomain
Feb 23 04:47:26 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:26 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:26 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 04:47:27 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 04:47:27 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:47:27 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:27 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:27 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 04:47:28 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 04:47:28 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:47:28 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:28 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:28 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:47:28 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:47:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:47:28 localhost podman[298628]: 2026-02-23 09:47:28.946168926 +0000 UTC m=+0.115892748 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Feb 23 04:47:28 localhost podman[298628]: 2026-02-23 09:47:28.980230363 +0000 UTC m=+0.149954215 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:47:28 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:47:29 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 04:47:29 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:29 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:47:29 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:29 localhost ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 04:47:29 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:47:29 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:47:30 localhost nova_compute[282206]: 2026-02-23 09:47:30.123 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:47:30 localhost nova_compute[282206]: 2026-02-23 09:47:30.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:47:30 localhost nova_compute[282206]: 2026-02-23 09:47:30.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:47:30 localhost nova_compute[282206]: 2026-02-23 09:47:30.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:47:30 localhost nova_compute[282206]: 2026-02-23 09:47:30.163 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:47:30 localhost nova_compute[282206]: 2026-02-23 09:47:30.164 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:47:30 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:30 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:30 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:47:30 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:31 localhost ceph-mon[294160]: mon.np0005626463@3(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:47:32 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:33 localhost ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 04:47:33 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:33 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:47:33 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:34 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea593e5600 mon_map magic: 0 from mon.2 v2:172.18.0.107:3300/0
Feb 23 04:47:34 localhost ceph-mon[294160]: mon.np0005626463@3(peon) e11 my rank is now 2 (was 3)
Feb 23 04:47:34 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 04:47:34 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 04:47:34 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election
Feb 23 04:47:34 localhost ceph-mon[294160]: paxos.2).electionLogic(44) init, last seen epoch 44
Feb 23 04:47:34 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x55ea62c4e000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 23 04:47:34 localhost ceph-mon[294160]: mon.np0005626463@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:47:35 localhost nova_compute[282206]: 2026-02-23 09:47:35.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:47:35 localhost nova_compute[282206]: 2026-02-23 09:47:35.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:47:35 localhost nova_compute[282206]: 2026-02-23 09:47:35.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:47:35 localhost nova_compute[282206]: 2026-02-23 09:47:35.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:47:35 localhost nova_compute[282206]: 2026-02-23 09:47:35.198 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:47:35 localhost nova_compute[282206]: 2026-02-23 09:47:35.199 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:47:36 localhost sshd[298682]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:47:39 localhost podman[242954]: time="2026-02-23T09:47:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 04:47:39 localhost podman[242954]: @ - - [23/Feb/2026:09:47:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 04:47:39 localhost podman[242954]: @ - - [23/Feb/2026:09:47:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18262 "" "Go-http-client/1.1"
Feb 23 04:47:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:47:39 localhost ceph-mon[294160]: paxos.2).electionLogic(45) init, last seen epoch 45, mid-election, bumping
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626463@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626463@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Feb 23 04:47:39 localhost systemd[1]: tmp-crun.BIXF8w.mount: Deactivated successfully.
Feb 23 04:47:39 localhost podman[298684]: 2026-02-23 09:47:39.917759537 +0000 UTC m=+0.088084802 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 23 04:47:39 localhost podman[298684]: 2026-02-23 09:47:39.950630897 +0000 UTC m=+0.120956192 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:47:39 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 04:47:39 localhost ceph-mon[294160]: Remove daemons mon.np0005626465
Feb 23 04:47:39 localhost ceph-mon[294160]: Safe to remove mon.np0005626465: new quorum should be ['np0005626461', 'np0005626466', 'np0005626463'] (from ['np0005626461', 'np0005626466', 'np0005626463'])
Feb 23 04:47:39 localhost ceph-mon[294160]: Removing monitor np0005626465 from monmap...
Feb 23 04:47:39 localhost ceph-mon[294160]: Removing daemon mon.np0005626465 from np0005626465.localdomain -- ports []
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466 in quorum (ranks 0,1)
Feb 23 04:47:39 localhost ceph-mon[294160]: overall HEALTH_OK
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626461 calling monitor election
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election
Feb 23 04:47:39 localhost ceph-mon[294160]: mon.np0005626461 is new leader, mons np0005626461,np0005626466,np0005626463 in quorum (ranks 0,1,2)
Feb 23 04:47:39 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:47:39 localhost ceph-mon[294160]: overall HEALTH_OK
Feb 23 04:47:39 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:40 localhost nova_compute[282206]: 2026-02-23 09:47:40.200 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:47:40 localhost nova_compute[282206]: 2026-02-23 09:47:40.201 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:47:40 localhost nova_compute[282206]: 2026-02-23 09:47:40.201 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:47:40 localhost nova_compute[282206]: 2026-02-23 09:47:40.201 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:47:40 localhost nova_compute[282206]: 2026-02-23 09:47:40.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:47:40 localhost nova_compute[282206]: 2026-02-23 09:47:40.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:47:41 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf
Feb 23 04:47:41 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 04:47:41 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 04:47:41 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 04:47:41 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:47:42 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:47:42 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:47:42 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:47:42 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:47:42 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:47:43 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)...
Feb 23 04:47:43 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain
Feb 23 04:47:43 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:43 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:43 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:47:43 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:47:43 localhost openstack_network_exporter[245358]: ERROR 09:47:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 04:47:43 localhost openstack_network_exporter[245358]:
Feb 23 04:47:43 localhost openstack_network_exporter[245358]: ERROR 09:47:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 04:47:43 localhost openstack_network_exporter[245358]:
Feb 23 04:47:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:47:44 localhost podman[299063]: 2026-02-23 09:47:44.014129139 +0000 UTC m=+0.067054861 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 04:47:44 localhost podman[299063]: 2026-02-23 09:47:44.027219238 +0000 UTC m=+0.080144950 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 04:47:44 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:47:44 localhost ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)...
Feb 23 04:47:44 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain
Feb 23 04:47:44 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:44 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 04:47:44 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:47:44 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:47:44 localhost podman[299120]:
Feb 23 04:47:44 localhost podman[299120]: 2026-02-23 09:47:44.404986366 +0000 UTC m=+0.061064380 container create 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7)
Feb 23 04:47:44 localhost systemd[1]: Started libpod-conmon-189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc.scope.
Feb 23 04:47:44 localhost systemd[1]: Started libcrun container.
Feb 23 04:47:44 localhost podman[299120]: 2026-02-23 09:47:44.373367172 +0000 UTC m=+0.029445206 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:47:44 localhost podman[299120]: 2026-02-23 09:47:44.473195451 +0000 UTC m=+0.129273465 container init 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_BRANCH=main, release=1770267347, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=)
Feb 23 04:47:44 localhost podman[299120]: 2026-02-23 09:47:44.482761822 +0000 UTC m=+0.138839836 container start 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.buildah.version=1.42.2, RELEASE=main, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, com.redhat.component=rhceph-container) Feb 23 04:47:44 localhost podman[299120]: 2026-02-23 09:47:44.483053501 +0000 UTC m=+0.139131555 container attach 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True) Feb 23 04:47:44 localhost confident_gagarin[299136]: 167 167 Feb 23 04:47:44 localhost systemd[1]: libpod-189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc.scope: Deactivated successfully. 
Feb 23 04:47:44 localhost podman[299120]: 2026-02-23 09:47:44.48599914 +0000 UTC m=+0.142077164 container died 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vcs-type=git, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 23 04:47:44 localhost podman[299141]: 2026-02-23 09:47:44.574226126 +0000 UTC m=+0.076886302 container remove 189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gagarin, release=1770267347, ceph=True, io.buildah.version=1.42.2, vcs-type=git, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Feb 23 04:47:44 localhost systemd[1]: libpod-conmon-189b7ee2f67fc381bd6938fffb0f33ee90ba6164b6cde581cc16eafc29d640dc.scope: Deactivated successfully. Feb 23 04:47:45 localhost systemd[1]: var-lib-containers-storage-overlay-2b81b50680d3517ec08aedb96ce724e8acdef65e24c8832a4bc0eaa323b463f7-merged.mount: Deactivated successfully. Feb 23 04:47:45 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:47:45 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:47:45 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:45 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:45 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:47:45 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:45 localhost nova_compute[282206]: 2026-02-23 09:47:45.240 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:45 localhost nova_compute[282206]: 2026-02-23 09:47:45.242 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:45 localhost nova_compute[282206]: 2026-02-23 09:47:45.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:45 localhost nova_compute[282206]: 2026-02-23 09:47:45.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:45 localhost podman[299211]: Feb 23 04:47:45 localhost nova_compute[282206]: 2026-02-23 09:47:45.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:45 localhost nova_compute[282206]: 2026-02-23 09:47:45.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:45 localhost podman[299211]: 
2026-02-23 09:47:45.281998967 +0000 UTC m=+0.085013729 container create 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1770267347, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:47:45 localhost systemd[1]: Started libpod-conmon-1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13.scope. Feb 23 04:47:45 localhost systemd[1]: Started libcrun container. 
Feb 23 04:47:45 localhost podman[299211]: 2026-02-23 09:47:45.241673859 +0000 UTC m=+0.044688641 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:45 localhost podman[299211]: 2026-02-23 09:47:45.341490388 +0000 UTC m=+0.144505150 container init 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, release=1770267347, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph) Feb 23 04:47:45 localhost podman[299211]: 2026-02-23 09:47:45.350159441 +0000 UTC m=+0.153174233 container start 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, version=7, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, 
com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, ceph=True, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:47:45 localhost podman[299211]: 2026-02-23 09:47:45.350963446 +0000 UTC m=+0.153978208 container attach 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=) Feb 23 04:47:45 localhost recursing_franklin[299225]: 167 167 Feb 23 04:47:45 localhost systemd[1]: libpod-1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13.scope: Deactivated successfully. Feb 23 04:47:45 localhost podman[299211]: 2026-02-23 09:47:45.35569792 +0000 UTC m=+0.158712702 container died 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_BRANCH=main, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:47:45 localhost podman[299230]: 2026-02-23 09:47:45.441807031 +0000 UTC m=+0.079063368 container remove 1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_franklin, version=7, name=rhceph, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, GIT_BRANCH=main, release=1770267347) Feb 23 04:47:45 localhost systemd[1]: libpod-conmon-1dc054cb1a3438bf0c04c55c06792238e8ef2bff7ad3cd24e40d72b669ccae13.scope: Deactivated successfully. Feb 23 04:47:46 localhost systemd[1]: tmp-crun.F5REz1.mount: Deactivated successfully. Feb 23 04:47:46 localhost systemd[1]: var-lib-containers-storage-overlay-2620c415444659a5063586f167c4dca73244797973e0676666d4759a58f94e55-merged.mount: Deactivated successfully. Feb 23 04:47:46 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... 
Feb 23 04:47:46 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:47:46 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:46 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:46 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:47:46 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:46 localhost podman[299306]: Feb 23 04:47:46 localhost podman[299306]: 2026-02-23 09:47:46.328655422 +0000 UTC m=+0.080504012 container create a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, build-date=2026-02-09T10:25:24Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, 
com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:47:46 localhost systemd[1]: Started libpod-conmon-a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a.scope. Feb 23 04:47:46 localhost systemd[1]: Started libcrun container. Feb 23 04:47:46 localhost podman[299306]: 2026-02-23 09:47:46.294172222 +0000 UTC m=+0.046020862 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:46 localhost podman[299306]: 2026-02-23 09:47:46.395843096 +0000 UTC m=+0.147691686 container init a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347) Feb 23 04:47:46 localhost podman[299306]: 2026-02-23 09:47:46.402683865 +0000 UTC m=+0.154532455 container start 
a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, distribution-scope=public, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1770267347, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z) Feb 23 04:47:46 localhost podman[299306]: 2026-02-23 09:47:46.402963953 +0000 UTC m=+0.154812543 container attach a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, vcs-type=git, version=7, distribution-scope=public, maintainer=Guillaume Abrioux , 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1770267347, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph) Feb 23 04:47:46 localhost agitated_turing[299321]: 167 167 Feb 23 04:47:46 localhost systemd[1]: libpod-a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a.scope: Deactivated successfully. Feb 23 04:47:46 localhost podman[299306]: 2026-02-23 09:47:46.406302565 +0000 UTC m=+0.158151165 container died a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347) Feb 23 04:47:46 localhost podman[299326]: 2026-02-23 09:47:46.542533061 +0000 UTC m=+0.124368106 container remove a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=agitated_turing, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:47:46 localhost systemd[1]: libpod-conmon-a01be3416707ee0e7c83166e6c17b279acd5efb7f809218c20ed4af8a47e446a.scope: Deactivated successfully. 
Feb 23 04:47:47 localhost systemd[1]: var-lib-containers-storage-overlay-acc7bffc5d9a93190881b278be83339f28a0e53f0b17123632f22af7904d06b8-merged.mount: Deactivated successfully. Feb 23 04:47:47 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)... Feb 23 04:47:47 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:47:47 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:47 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:47 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:47 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:47 localhost podman[299400]: Feb 23 04:47:47 localhost podman[299400]: 2026-02-23 09:47:47.40138487 +0000 UTC m=+0.078037416 container create 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, version=7, release=1770267347, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph 
ceph, ceph=True, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git) Feb 23 04:47:47 localhost systemd[1]: Started libpod-conmon-77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d.scope. Feb 23 04:47:47 localhost systemd[1]: Started libcrun container. Feb 23 04:47:47 localhost podman[299400]: 2026-02-23 09:47:47.468235324 +0000 UTC m=+0.144887860 container init 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
maintainer=Guillaume Abrioux , GIT_CLEAN=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=) Feb 23 04:47:47 localhost podman[299400]: 2026-02-23 09:47:47.370086227 +0000 UTC m=+0.046738803 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:47 localhost podman[299400]: 2026-02-23 09:47:47.480390604 +0000 UTC m=+0.157043150 container start 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:47:47 localhost podman[299400]: 2026-02-23 09:47:47.480727534 +0000 UTC m=+0.157380120 container attach 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.buildah.version=1.42.2, name=rhceph, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7) Feb 23 04:47:47 localhost eager_bose[299415]: 167 167 Feb 23 04:47:47 localhost systemd[1]: libpod-77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d.scope: Deactivated successfully. 
Feb 23 04:47:47 localhost podman[299400]: 2026-02-23 09:47:47.483564171 +0000 UTC m=+0.160216697 container died 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, name=rhceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux ) Feb 23 04:47:47 localhost podman[299420]: 2026-02-23 09:47:47.572630731 +0000 UTC m=+0.079829990 container remove 77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_bose, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, release=1770267347, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux ) Feb 23 04:47:47 localhost systemd[1]: libpod-conmon-77687fef8b7826008e7c936abb7c20fd805874aad2aa24527381dfeeb5cf871d.scope: Deactivated successfully. Feb 23 04:47:48 localhost systemd[1]: tmp-crun.wEDGYC.mount: Deactivated successfully. Feb 23 04:47:48 localhost systemd[1]: var-lib-containers-storage-overlay-08631254c1bab725bd764818c5a74919dac759a151daafbc9f661d84e3787234-merged.mount: Deactivated successfully. Feb 23 04:47:48 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... 
Feb 23 04:47:48 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:47:48 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:48 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:48 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:48 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:48 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:48 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:47:48 localhost podman[299492]: Feb 23 04:47:48 localhost podman[299492]: 2026-02-23 09:47:48.315725168 +0000 UTC m=+0.077493060 container create 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, version=7, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Feb 23 04:47:48 localhost systemd[1]: Started libpod-conmon-6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634.scope. Feb 23 04:47:48 localhost systemd[1]: Started libcrun container. Feb 23 04:47:48 localhost podman[299492]: 2026-02-23 09:47:48.376739254 +0000 UTC m=+0.138507166 container init 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1770267347, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., version=7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, GIT_CLEAN=True) Feb 23 04:47:48 localhost podman[299492]: 2026-02-23 09:47:48.282504267 +0000 UTC m=+0.044272189 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:47:48 localhost podman[299492]: 2026-02-23 09:47:48.386042718 +0000 UTC m=+0.147810610 container start 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, RELEASE=main, name=rhceph, release=1770267347, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Feb 23 04:47:48 localhost podman[299492]: 2026-02-23 09:47:48.386308156 +0000 UTC m=+0.148076048 container attach 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, GIT_CLEAN=True, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, vcs-type=git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container) Feb 23 04:47:48 localhost cranky_ptolemy[299507]: 167 167 Feb 23 04:47:48 localhost systemd[1]: libpod-6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634.scope: Deactivated successfully. 
Feb 23 04:47:48 localhost podman[299492]: 2026-02-23 09:47:48.390039739 +0000 UTC m=+0.151807631 container died 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, architecture=x86_64, release=1770267347, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph) Feb 23 04:47:48 localhost podman[299512]: 2026-02-23 09:47:48.484837815 +0000 UTC m=+0.086305308 container remove 6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cranky_ptolemy, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1770267347, 
distribution-scope=public, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:47:48 localhost systemd[1]: libpod-conmon-6ed5a1bd72ad074ea2b6e421a10f5d6a025a0a347b58bcc25b8c9bee68551634.scope: Deactivated successfully. Feb 23 04:47:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:47:48.549 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:47:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:47:48.550 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:47:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:47:48.551 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:47:49 localhost systemd[1]: 
var-lib-containers-storage-overlay-86a5054a0f3bf282b2b900f7b905344455fccf51ecc22bf3909309fbe85f6045-merged.mount: Deactivated successfully. Feb 23 04:47:49 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)... Feb 23 04:47:49 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:47:49 localhost ceph-mon[294160]: Deploying daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:47:49 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:49 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:49 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:49 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:50 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)... 
Feb 23 04:47:50 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain Feb 23 04:47:50 localhost nova_compute[282206]: 2026-02-23 09:47:50.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:50 localhost nova_compute[282206]: 2026-02-23 09:47:50.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:50 localhost nova_compute[282206]: 2026-02-23 09:47:50.281 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:47:50 localhost nova_compute[282206]: 2026-02-23 09:47:50.281 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:50 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:50 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:50 localhost nova_compute[282206]: 2026-02-23 09:47:50.321 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:50 localhost nova_compute[282206]: 2026-02-23 09:47:50.322 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:50 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:51 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:51 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:51 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:51 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:51 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:51 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)... Feb 23 04:47:51 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 23 04:47:51 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain Feb 23 04:47:52 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:52 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:52 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:52 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)... Feb 23 04:47:52 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 23 04:47:52 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain Feb 23 04:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:47:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:47:52 localhost podman[299529]: 2026-02-23 09:47:52.952068985 +0000 UTC m=+0.115395803 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:47:52 localhost podman[299529]: 2026-02-23 09:47:52.964363009 +0000 UTC m=+0.127689827 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:47:52 localhost systemd[1]: tmp-crun.pg6iNH.mount: Deactivated successfully. 
Feb 23 04:47:52 localhost podman[299528]: 2026-02-23 09:47:52.979891581 +0000 UTC m=+0.143173988 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:47:52 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:47:53 localhost podman[299528]: 2026-02-23 09:47:53.07412295 +0000 UTC m=+0.237405387 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible) Feb 23 04:47:53 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:47:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:53 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:53 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:54 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:54 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:55 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)... Feb 23 04:47:55 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:55 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)... 
Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:47:55 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:55 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:47:55 localhost nova_compute[282206]: 2026-02-23 09:47:55.323 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:55 localhost nova_compute[282206]: 2026-02-23 09:47:55.324 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:47:55 localhost nova_compute[282206]: 2026-02-23 09:47:55.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:47:55 localhost nova_compute[282206]: 2026-02-23 09:47:55.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:55 localhost nova_compute[282206]: 2026-02-23 09:47:55.354 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:47:55 localhost nova_compute[282206]: 2026-02-23 09:47:55.355 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:47:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:47:55 localhost podman[299578]: 2026-02-23 09:47:55.915100444 +0000 UTC m=+0.089174926 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image':
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 23 04:47:55 localhost podman[299578]: 2026-02-23 09:47:55.954799532 +0000 UTC m=+0.128873994 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute) Feb 23 04:47:55 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:47:56 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:47:56 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)... 
Feb 23 04:47:56 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:47:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:56 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:56 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:47:56 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:57 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)... Feb 23 04:47:57 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:47:57 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:57 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:57 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:47:58 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)... 
Feb 23 04:47:58 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:47:58 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:58 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:47:58 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:58 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:47:58 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:58 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:47:59 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)... Feb 23 04:47:59 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain Feb 23 04:47:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:47:59 localhost systemd[1]: tmp-crun.7YqP8P.mount: Deactivated successfully. 
Feb 23 04:47:59 localhost podman[299597]: 2026-02-23 09:47:59.924059325 +0000 UTC m=+0.099442847 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:47:59 localhost 
podman[299597]: 2026-02-23 09:47:59.932231714 +0000 UTC m=+0.107615156 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:47:59 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:48:00 localhost nova_compute[282206]: 2026-02-23 09:48:00.356 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:00 localhost nova_compute[282206]: 2026-02-23 09:48:00.357 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:00 localhost nova_compute[282206]: 2026-02-23 09:48:00.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:00 localhost nova_compute[282206]: 2026-02-23 09:48:00.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:00 localhost nova_compute[282206]: 2026-02-23 09:48:00.397 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:00 localhost nova_compute[282206]: 2026-02-23 09:48:00.398 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:00 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:00 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:00 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:48:00 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' 
cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:48:00 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:01 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:01 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)... Feb 23 04:48:01 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:48:01 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:01 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:01 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Feb 23 04:48:01 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/2706546237' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Feb 23 04:48:02 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:02 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:02 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:02 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:03 localhost ceph-mon[294160]: Reconfig service osd.default_drive_group Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' 
entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:03 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:04 localhost podman[299752]: Feb 23 04:48:04 localhost podman[299752]: 2026-02-23 09:48:04.04971057 +0000 UTC m=+0.074750997 container create 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, io.buildah.version=1.42.2) Feb 23 04:48:04 localhost systemd[1]: Started libpod-conmon-7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba.scope. Feb 23 04:48:04 localhost systemd[1]: Started libcrun container. 
Feb 23 04:48:04 localhost podman[299752]: 2026-02-23 09:48:04.117040279 +0000 UTC m=+0.142080746 container init 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, ceph=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:48:04 localhost podman[299752]: 2026-02-23 09:48:04.021659426 +0000 UTC m=+0.046699933 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:04 localhost podman[299752]: 2026-02-23 09:48:04.178989775 +0000 UTC m=+0.204030232 container start 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, CEPH_POINT_RELEASE=) Feb 23 04:48:04 localhost podman[299752]: 2026-02-23 09:48:04.17980659 +0000 UTC m=+0.204847057 container attach 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:48:04 localhost cool_hertz[299767]: 167 167 Feb 23 04:48:04 localhost systemd[1]: libpod-7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba.scope: Deactivated successfully. Feb 23 04:48:04 localhost podman[299752]: 2026-02-23 09:48:04.183508983 +0000 UTC m=+0.208549460 container died 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.expose-services=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:48:04 localhost podman[299772]: 2026-02-23 09:48:04.26822414 +0000 
UTC m=+0.073460016 container remove 7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_hertz, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_BRANCH=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main) Feb 23 04:48:04 localhost systemd[1]: libpod-conmon-7817e5c1520603d87f75d56ecc4494f4efb6226fa7c751fba29bf46d3ea8bfba.scope: Deactivated successfully. 
Feb 23 04:48:04 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:48:04 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:48:04 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:04 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:04 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:05 localhost systemd[1]: tmp-crun.HnamUS.mount: Deactivated successfully. Feb 23 04:48:05 localhost systemd[1]: var-lib-containers-storage-overlay-956ac6c11b6a2b90a6335f65b451c5bd114a3c426040e7a4dbad42676f695186-merged.mount: Deactivated successfully. Feb 23 04:48:05 localhost podman[299847]: Feb 23 04:48:05 localhost podman[299847]: 2026-02-23 09:48:05.107270857 +0000 UTC m=+0.072423005 container create e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:48:05 localhost systemd[1]: Started libpod-conmon-e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc.scope. Feb 23 04:48:05 localhost systemd[1]: Started libcrun container. Feb 23 04:48:05 localhost podman[299847]: 2026-02-23 09:48:05.168220791 +0000 UTC m=+0.133372919 container init e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, version=7, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, 
architecture=x86_64) Feb 23 04:48:05 localhost podman[299847]: 2026-02-23 09:48:05.076780129 +0000 UTC m=+0.041932317 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:05 localhost podman[299847]: 2026-02-23 09:48:05.176851734 +0000 UTC m=+0.142003882 container start e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, CEPH_POINT_RELEASE=, name=rhceph, maintainer=Guillaume Abrioux , release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:48:05 localhost cool_heisenberg[299862]: 167 167 Feb 23 04:48:05 localhost podman[299847]: 2026-02-23 09:48:05.177259227 +0000 UTC m=+0.142411365 container attach e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, release=1770267347, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main) Feb 23 04:48:05 localhost systemd[1]: libpod-e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc.scope: Deactivated successfully. 
Feb 23 04:48:05 localhost podman[299847]: 2026-02-23 09:48:05.179691411 +0000 UTC m=+0.144843609 container died e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, distribution-scope=public, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2) Feb 23 04:48:05 localhost podman[299867]: 2026-02-23 09:48:05.27690811 +0000 UTC m=+0.081427170 container remove e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_heisenberg, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, RELEASE=main, architecture=x86_64, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z) Feb 23 04:48:05 localhost systemd[1]: libpod-conmon-e1f31e29d6240dcedc48c89801c390e79a58e879e6b4fa1b7e51e5d3a07992dc.scope: Deactivated successfully. Feb 23 04:48:05 localhost nova_compute[282206]: 2026-02-23 09:48:05.398 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:05 localhost nova_compute[282206]: 2026-02-23 09:48:05.401 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:05 localhost nova_compute[282206]: 2026-02-23 09:48:05.401 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:05 localhost nova_compute[282206]: 2026-02-23 09:48:05.401 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:05 localhost nova_compute[282206]: 2026-02-23 09:48:05.472 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:05 localhost nova_compute[282206]: 2026-02-23 09:48:05.473 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:05 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:05 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:05 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:05 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:48:05 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:48:05 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:05 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:06 localhost systemd[1]: var-lib-containers-storage-overlay-545d1b1a2a35f22d6d3556b018dd151a0d75ba70343262c5036d92c3644dae9e-merged.mount: Deactivated successfully. 
Feb 23 04:48:06 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:06 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:06 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:06 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:06 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 23 04:48:06 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain Feb 23 04:48:06 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:06 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:07 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:07 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:07 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:07 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:07 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain Feb 23 04:48:07 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr fail"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : 
from='client.? 172.18.0.200:0/3009308721' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 e84: 6 total, 6 up, 6 in Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr handle_mgr_map Activating! Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr handle_mgr_map I am now activating Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626461"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626461"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626465.drvnoy"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 0 Feb 23 04:48:07 localhost ceph-mon[294160]: 
mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626466.vaywlp"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 0 Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata", "who": "mds.np0005626463.qcthuc"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 0 Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626463.wtksup", "id": "np0005626463.wtksup"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626466.nisqfq", "id": "np0005626466.nisqfq"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} v 0) Feb 23 
04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626461.lrfquh", "id": "np0005626461.lrfquh"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626460.fyrady", "id": "np0005626460.fyrady"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 0} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 1} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 2} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' 
entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 3} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 4} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata", "id": 5} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mds metadata"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mds metadata"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon).mds e17 all = 1 Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd metadata"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd metadata"} : dispatch Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata"} : dispatch Feb 23 04:48:07 localhost ceph-mgr[288036]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load 
Constructed class from module: balancer Feb 23 04:48:07 localhost ceph-mgr[288036]: [balancer INFO root] Starting Feb 23 04:48:07 localhost ceph-mgr[288036]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: [balancer INFO root] Optimize plan auto_2026-02-23_09:48:07 Feb 23 04:48:07 localhost ceph-mgr[288036]: [balancer INFO root] Mode upmap, max misplaced 0.050000 Feb 23 04:48:07 localhost ceph-mgr[288036]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later Feb 23 04:48:07 localhost ceph-mgr[288036]: [cephadm WARNING root] removing stray HostCache host record np0005626460.localdomain.devices.0 Feb 23 04:48:07 localhost ceph-mgr[288036]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005626460.localdomain.devices.0 Feb 23 04:48:07 localhost systemd[1]: session-68.scope: Deactivated successfully. Feb 23 04:48:07 localhost systemd[1]: session-68.scope: Consumed 20.524s CPU time. Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch Feb 23 04:48:07 localhost systemd-logind[759]: Session 68 logged out. Waiting for processes to exit. Feb 23 04:48:07 localhost systemd-logind[759]: Removed session 68. 
Feb 23 04:48:07 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} v 0) Feb 23 04:48:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: cephadm Feb 23 04:48:07 localhost ceph-mgr[288036]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: crash Feb 23 04:48:07 localhost ceph-mgr[288036]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: devicehealth Feb 23 04:48:07 localhost ceph-mgr[288036]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: iostat Feb 23 04:48:07 localhost ceph-mgr[288036]: [devicehealth INFO root] Starting Feb 23 04:48:07 localhost ceph-mgr[288036]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: nfs Feb 23 04:48:07 localhost ceph-mgr[288036]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: orchestrator Feb 23 04:48:07 localhost ceph-mgr[288036]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: pg_autoscaler Feb 23 04:48:07 localhost ceph-mgr[288036]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 
04:48:07 localhost ceph-mgr[288036]: mgr load Constructed class from module: progress Feb 23 04:48:07 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] _maybe_adjust Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:08 localhost ceph-mgr[288036]: [progress INFO root] Loading... Feb 23 04:48:08 localhost ceph-mgr[288036]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events Feb 23 04:48:08 localhost ceph-mgr[288036]: [progress INFO root] Loaded OSDMap, ready. Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] recovery thread starting Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] starting setup Feb 23 04:48:08 localhost ceph-mgr[288036]: mgr load Constructed class from module: rbd_support Feb 23 04:48:08 localhost ceph-mgr[288036]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:08 localhost ceph-mgr[288036]: mgr load Constructed class from module: restful Feb 23 04:48:08 localhost ceph-mgr[288036]: [restful INFO root] server_addr: :: server_port: 8003 Feb 23 04:48:08 localhost ceph-mgr[288036]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:08 localhost ceph-mgr[288036]: mgr load Constructed class from module: status Feb 23 04:48:08 localhost ceph-mgr[288036]: [restful WARNING root] server not running: no certificate configured Feb 23 04:48:08 localhost ceph-mgr[288036]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:08 localhost ceph-mgr[288036]: mgr load Constructed class from module: telemetry Feb 23 04:48:08 localhost ceph-mgr[288036]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5) Feb 23 04:48:08 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix":"config 
rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} v 0) Feb 23 04:48:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} : dispatch Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] PerfHandler: starting Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_task_task: vms, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs' Feb 23 04:48:08 localhost ceph-mgr[288036]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs' Feb 23 04:48:08 localhost ceph-mgr[288036]: mgr load Constructed class from module: volumes Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_task_task: volumes, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_task_task: images, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_task_task: backups, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] TaskHandler: starting Feb 23 04:48:08 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 
handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} v 0) Feb 23 04:48:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} : dispatch Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.079+0000 7f43f870a640 -1 client.0 error registering admin socket command: (17) 
File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:08.082+0000 7f43f4702640 -1 client.0 error registering admin socket command: (17) File exists Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support 
INFO root] load_schedules: images, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after= Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting Feb 23 04:48:08 localhost ceph-mgr[288036]: [rbd_support INFO root] setup complete Feb 23 04:48:08 localhost sshd[300031]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:48:08 localhost systemd-logind[759]: New session 69 of user ceph-admin. Feb 23 04:48:08 localhost systemd[1]: Started Session 69 of User ceph-admin. Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26597 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26597 172.18.0.107:0/4195960694' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='client.? 172.18.0.200:0/3009308721' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: Activating manager daemon np0005626463.wtksup Feb 23 04:48:08 localhost ceph-mon[294160]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:48:08 localhost ceph-mon[294160]: Manager daemon np0005626463.wtksup is now available Feb 23 04:48:08 localhost ceph-mon[294160]: removing stray HostCache host record np0005626460.localdomain.devices.0 Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"}]': finished Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626460.localdomain.devices.0"}]': finished Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/mirror_snapshot_schedule"} : dispatch Feb 23 04:48:08 localhost 
ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} : dispatch Feb 23 04:48:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626463.wtksup/trash_purge_schedule"} : dispatch Feb 23 04:48:08 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:08 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:08 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:09 localhost podman[300142]: 2026-02-23 09:48:09.282972863 +0000 UTC m=+0.118534229 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vcs-type=git, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, ceph=True, io.buildah.version=1.42.2, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container) Feb 23 04:48:09 localhost podman[242954]: time="2026-02-23T09:48:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:48:09 localhost podman[300142]: 2026-02-23 09:48:09.445797798 +0000 UTC m=+0.281359214 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, release=1770267347, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, RELEASE=main, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, ceph=True) Feb 23 04:48:09 localhost podman[242954]: @ - - [23/Feb/2026:09:48:09 +0000] 
"GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:48:09 localhost ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Bus STARTING Feb 23 04:48:09 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Bus STARTING Feb 23 04:48:09 localhost podman[242954]: @ - - [23/Feb/2026:09:48:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18267 "" "Go-http-client/1.1" Feb 23 04:48:09 localhost ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Serving on http://172.18.0.106:8765 Feb 23 04:48:09 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Serving on http://172.18.0.106:8765 Feb 23 04:48:09 localhost ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Serving on https://172.18.0.106:7150 Feb 23 04:48:09 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Serving on https://172.18.0.106:7150 Feb 23 04:48:09 localhost ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Bus STARTED Feb 23 04:48:09 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Bus STARTED Feb 23 04:48:09 localhost ceph-mgr[288036]: [cephadm INFO cherrypy.error] [23/Feb/2026:09:48:09] ENGINE Client ('172.18.0.106', 50730) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:48:09 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : [23/Feb/2026:09:48:09] ENGINE Client ('172.18.0.106', 50730) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:48:09 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from 
mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:09 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:09 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:09 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:09 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:09 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:48:09 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:48:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0) Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0) Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:10 localhost podman[300273]: 2026-02-23 09:48:10.090341885 +0000 UTC m=+0.088519155 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:10 localhost ceph-mgr[288036]: [devicehealth INFO root] Check health Feb 23 04:48:10 localhost podman[300273]: 2026-02-23 09:48:10.176166907 +0000 UTC m=+0.174344167 
container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:48:10 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:48:10 localhost nova_compute[282206]: 2026-02-23 09:48:10.473 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:10 localhost nova_compute[282206]: 2026-02-23 09:48:10.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:10 localhost nova_compute[282206]: 2026-02-23 09:48:10.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:10 localhost nova_compute[282206]: 2026-02-23 09:48:10.475 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:10 localhost nova_compute[282206]: 2026-02-23 09:48:10.543 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:10 localhost nova_compute[282206]: 2026-02-23 09:48:10.544 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:10 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:10 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": 
"mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:10 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:10 localhost ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Bus STARTING Feb 23 04:48:10 localhost ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Serving on http://172.18.0.106:8765 Feb 23 04:48:10 localhost ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Serving on https://172.18.0.106:7150 Feb 23 04:48:10 localhost ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Bus STARTED Feb 23 04:48:10 localhost ceph-mon[294160]: [23/Feb/2026:09:48:09] ENGINE Client ('172.18.0.106', 50730) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:10 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:11 localhost sshd[300428]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO root] Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:48:11 localhost 
ceph-mgr[288036]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO root] Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below 
minimum 939524096 Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 23 
04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO root] Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) 
log [INF] : Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:11 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:11 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:11 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:11 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' 
entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd/host:np0005626461", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:48:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:12 localhost sshd[300595]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:48:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 
localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:12 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mgr.np0005626465.hlpkwo 172.18.0.107:0/3454997775; not ready for session (expect reconnect) Feb 23 04:48:12 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} v 0) Feb 23 04:48:12 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr metadata", "who": "np0005626465.hlpkwo", "id": "np0005626465.hlpkwo"} : dispatch Feb 23 04:48:12 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:12 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:12 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", 
"id": "np0005626465"} : dispatch Feb 23 04:48:12 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:13 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:48:13 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:48:13 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:48:13 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:48:13 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:48:13 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:48:13 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:13 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:13 
localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:13 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:13 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:48:13 localhost openstack_network_exporter[245358]: ERROR 09:48:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath 
Feb 23 04:48:13 localhost openstack_network_exporter[245358]:
Feb 23 04:48:13 localhost openstack_network_exporter[245358]: ERROR 09:48:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 04:48:13 localhost openstack_network_exporter[245358]:
Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.783 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.783 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.784 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Feb 23 04:48:13 localhost nova_compute[282206]: 2026-02-23 09:48:13.784 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Feb 23 04:48:13 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 04:48:13 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:48:13 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 04:48:13 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 04:48:13 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:48:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:48:14 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 23 04:48:14 localhost nova_compute[282206]: 2026-02-23 09:48:14.139 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 04:48:14 localhost podman[301074]: 2026-02-23 09:48:14.176005792 +0000 UTC m=+0.098759058 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.7, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=)
Feb 23 04:48:14 localhost nova_compute[282206]: 2026-02-23 09:48:14.177 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Feb 23 04:48:14 localhost nova_compute[282206]: 2026-02-23 09:48:14.178 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Feb 23 04:48:14 localhost nova_compute[282206]: 2026-02-23 09:48:14.178 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 04:48:14 localhost podman[301074]: 2026-02-23 09:48:14.191357178 +0000 UTC m=+0.114110484 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, build-date=2026-02-05T04:57:10Z, release=1770267347, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Feb 23 04:48:14 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 04:48:14 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev c17efa0e-f80d-41ba-9f40-152245f118b1 (Updating node-proxy deployment (+4 -> 4))
Feb 23 04:48:14 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev c17efa0e-f80d-41ba-9f40-152245f118b1 (Updating node-proxy deployment (+4 -> 4))
Feb 23 04:48:14 localhost ceph-mgr[288036]: [progress INFO root] Completed event c17efa0e-f80d-41ba-9f40-152245f118b1 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:14 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:48:14 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:48:14 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 04:48:14 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:48:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 04:48:14 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 04:48:15 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:15 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:15 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:15 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:15 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 04:48:15 localhost nova_compute[282206]: 2026-02-23 09:48:15.544 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:48:15 localhost nova_compute[282206]: 2026-02-23 09:48:15.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:48:15 localhost nova_compute[282206]: 2026-02-23 09:48:15.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:48:15 localhost nova_compute[282206]: 2026-02-23 09:48:15.547 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:48:15 localhost nova_compute[282206]: 2026-02-23 09:48:15.575 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:48:15 localhost nova_compute[282206]: 2026-02-23 09:48:15.576 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:48:15 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 04:48:15 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 04:48:15 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:15 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:15 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Feb 23 04:48:15 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:48:15 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:48:16 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:48:16 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:16 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:16 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:16 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:16 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 04:48:16 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:48:16 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 04:48:16 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 04:48:16 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev a50cf599-777b-4364-bdbe-a2e0047bc2d0 (Updating node-proxy deployment (+4 -> 4))
Feb 23 04:48:16 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev a50cf599-777b-4364-bdbe-a2e0047bc2d0 (Updating node-proxy deployment (+4 -> 4))
Feb 23 04:48:16 localhost ceph-mgr[288036]: [progress INFO root] Completed event a50cf599-777b-4364-bdbe-a2e0047bc2d0 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 23 04:48:16 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 04:48:16 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 04:48:17 localhost nova_compute[282206]: 2026-02-23 09:48:17.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:48:17 localhost nova_compute[282206]: 2026-02-23 09:48:17.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:48:17 localhost nova_compute[282206]: 2026-02-23 09:48:17.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:48:17 localhost nova_compute[282206]: 2026-02-23 09:48:17.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Feb 23 04:48:17 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 04:48:17 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:17 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:17 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:17 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:48:17 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:17 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 23 04:48:17 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 04:48:17 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:48:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 04:48:17 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 04:48:17 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 23 04:48:18 localhost ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 04:48:18 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.081 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.082 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.082 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.083 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.083 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:48:18 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:18 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 04:48:18 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1681775718' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.542 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.698 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.699 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:48:18 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect)
Feb 23 04:48:18 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:48:18 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch
Feb 23 04:48:18 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory
Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.930 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This
host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.932 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11723MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.932 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:48:18 localhost nova_compute[282206]: 2026-02-23 09:48:18.933 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.002 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.002 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.003 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.045 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:48:19 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:19 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:48:19 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2580160333' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.487 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.494 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.541 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.544 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:48:19 localhost nova_compute[282206]: 2026-02-23 09:48:19.544 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:48:19 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34459 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 23 04:48:19 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:19 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:19 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:19 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:19 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.541 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.542 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.576 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.578 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.617 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:20 localhost nova_compute[282206]: 2026-02-23 09:48:20.618 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:20 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:20 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:20 localhost ceph-mgr[288036]: mgr finish 
mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:21 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:21 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:21 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:21 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:21 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:21 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:21 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Feb 23 04:48:22 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.27135 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:48:22 localhost ceph-mgr[288036]: [cephadm INFO root] Saving service mon spec with placement label:mon Feb 23 04:48:22 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config 
generate-minimal-conf"} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 23 04:48:22 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev 2b72b9b3-71ae-42e5-a76a-18df583f9922 (Updating node-proxy deployment (+4 -> 4)) Feb 23 04:48:22 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev 2b72b9b3-71ae-42e5-a76a-18df583f9922 (Updating node-proxy deployment (+4 -> 4)) Feb 23 04:48:22 localhost ceph-mgr[288036]: [progress INFO root] Completed event 2b72b9b3-71ae-42e5-a76a-18df583f9922 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 23 04:48:22 localhost ceph-mon[294160]: Saving service mon spec with placement label:mon Feb 23 04:48:22 localhost ceph-mon[294160]: from='mgr.26614 ' 
entity='mgr.np0005626463.wtksup' Feb 23 04:48:22 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:22 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:22 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:22 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:48:22 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626461 (monmap changed)... Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:22 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:48:22 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring 
daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:48:22 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:22 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:22 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:23 localhost ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:23 localhost ceph-mon[294160]: Reconfiguring mon.np0005626461 (monmap changed)... 
Feb 23 04:48:23 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:23 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626461 on np0005626461.localdomain Feb 23 04:48:23 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:23 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34471 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626465", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0) Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0) Feb 23 04:48:23 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626463 (monmap changed)... Feb 23 04:48:23 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626463 (monmap changed)... 
Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 23 04:48:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 23 04:48:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:23 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:48:23 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:48:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:48:23 localhost podman[301228]: 2026-02-23 09:48:23.640440531 +0000 UTC m=+0.085888305 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 04:48:23 localhost podman[301229]: 2026-02-23 09:48:23.720998123 +0000 UTC m=+0.163152356 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:48:23 localhost podman[301229]: 2026-02-23 09:48:23.736213625 +0000 UTC m=+0.178367848 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', 
'--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:48:23 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:48:23 localhost podman[301228]: 2026-02-23 09:48:23.752571803 +0000 UTC m=+0.198019647 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:48:23 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:48:23 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:23 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:23 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:23 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Feb 23 04:48:24 localhost podman[301312]: Feb 23 04:48:24 localhost podman[301312]: 2026-02-23 09:48:24.065184328 +0000 UTC m=+0.089544306 container create d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=) Feb 23 04:48:24 localhost systemd[1]: Started libpod-conmon-d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200.scope. Feb 23 04:48:24 localhost systemd[1]: Started libcrun container. 
Feb 23 04:48:24 localhost podman[301312]: 2026-02-23 09:48:24.030712629 +0000 UTC m=+0.055072587 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:24 localhost podman[301312]: 2026-02-23 09:48:24.129503376 +0000 UTC m=+0.153863294 container init d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, release=1770267347, name=rhceph, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:48:24 localhost podman[301312]: 2026-02-23 09:48:24.139343425 +0000 UTC m=+0.163703383 container start d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, architecture=x86_64, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, 
io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, build-date=2026-02-09T10:25:24Z) Feb 23 04:48:24 localhost podman[301312]: 2026-02-23 09:48:24.139748417 +0000 UTC m=+0.164108355 container attach d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, build-date=2026-02-09T10:25:24Z, ceph=True, release=1770267347, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph 
ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, distribution-scope=public, version=7, maintainer=Guillaume Abrioux ) Feb 23 04:48:24 localhost hungry_ritchie[301327]: 167 167 Feb 23 04:48:24 localhost systemd[1]: libpod-d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200.scope: Deactivated successfully. Feb 23 04:48:24 localhost podman[301312]: 2026-02-23 09:48:24.143309156 +0000 UTC m=+0.167669074 container died d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=) Feb 23 04:48:24 localhost podman[301332]: 2026-02-23 
09:48:24.242661799 +0000 UTC m=+0.086058450 container remove d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_ritchie, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Feb 23 04:48:24 localhost systemd[1]: libpod-conmon-d58772fcbbec26abcccbf92423fef53f1a9b130fda70dd0632f49a6d53ce9200.scope: Deactivated successfully. Feb 23 04:48:24 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:24 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:24 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626466 (monmap changed)... 
Feb 23 04:48:24 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626466 (monmap changed)... Feb 23 04:48:24 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 23 04:48:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:24 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 23 04:48:24 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 23 04:48:24 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:24 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:24 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:48:24 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:48:24 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:24 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:24 localhost ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)... 
Feb 23 04:48:24 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:24 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:48:24 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:24 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:24 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.458376) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104458429, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2818, "num_deletes": 256, "total_data_size": 8432089, "memory_usage": 8912864, "flush_reason": "Manual Compaction"} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104496724, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5016845, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14319, "largest_seqno": 17132, 
"table_properties": {"data_size": 5004990, "index_size": 7400, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 30199, "raw_average_key_size": 22, "raw_value_size": 4979249, "raw_average_value_size": 3732, "num_data_blocks": 322, "num_entries": 1334, "num_filter_entries": 1334, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840042, "oldest_key_time": 1771840042, "file_creation_time": 1771840104, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 38415 microseconds, and 10536 cpu microseconds. Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.496787) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5016845 bytes OK Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.496816) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.499651) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.499679) EVENT_LOG_v1 {"time_micros": 1771840104499672, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.499699) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8418371, prev total WAL file size 8443858, number of live WAL files 2. Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.501268) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. 
'7061786F73003131303435' seq:0, type:0; will stop at (end) Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4899KB)], [24(14MB)] Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104501324, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20670629, "oldest_snapshot_seqno": -1} Feb 23 04:48:24 localhost systemd[1]: var-lib-containers-storage-overlay-a2f285afc8e7edd2b875030538209a94f5477e72cdfff44d1bb335ee3fea9dcf-merged.mount: Deactivated successfully. Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11131 keys, 18458382 bytes, temperature: kUnknown Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104660494, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 18458382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18392184, "index_size": 37297, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 298013, "raw_average_key_size": 26, "raw_value_size": 18199312, "raw_average_value_size": 1635, "num_data_blocks": 1436, "num_entries": 11131, "num_filter_entries": 11131, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", 
"merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840104, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.660858) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 18458382 bytes Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.662724) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.8 rd, 115.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 14.9 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(7.8) write-amplify(3.7) OK, records in: 11680, records dropped: 549 output_compression: NoCompression Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.662754) EVENT_LOG_v1 {"time_micros": 1771840104662741, "job": 12, "event": "compaction_finished", "compaction_time_micros": 159264, "compaction_time_cpu_micros": 53154, "output_level": 6, "num_output_files": 1, "total_output_size": 18458382, "num_input_records": 11680, "num_output_records": 11131, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, 
"num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104663641, "job": 12, "event": "table_file_deletion", "file_number": 26} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840104666292, "job": 12, "event": "table_file_deletion", "file_number": 24} Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.501189) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666418) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666421) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:24.666424) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:24 localhost 
ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:24 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:24 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:24 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:25 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:25 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:48:25 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:48:25 localhost ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)... 
Feb 23 04:48:25 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:48:25 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:25 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:25 localhost nova_compute[282206]: 2026-02-23 09:48:25.659 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:25 localhost nova_compute[282206]: 2026-02-23 09:48:25.661 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:25 localhost nova_compute[282206]: 2026-02-23 09:48:25.661 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5043 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:25 localhost nova_compute[282206]: 2026-02-23 09:48:25.662 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:25 localhost nova_compute[282206]: 2026-02-23 09:48:25.662 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:25 localhost nova_compute[282206]: 2026-02-23 09:48:25.665 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:25 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:25 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:25 localhost ceph-mon[294160]: 
log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:25 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:25 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s Feb 23 04:48:26 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:26 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:26 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:26 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:48:26 localhost systemd[1]: tmp-crun.9j97Ex.mount: Deactivated successfully. 
Feb 23 04:48:26 localhost podman[301349]: 2026-02-23 09:48:26.953501694 +0000 UTC m=+0.084300677 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS) Feb 23 04:48:26 localhost podman[301349]: 2026-02-23 09:48:26.989382756 +0000 UTC m=+0.120181709 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:48:27 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:48:27 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:27 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:27 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:27 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:27 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:27 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:27 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:28 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:28 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:28 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or 
directory Feb 23 04:48:29 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:29 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:29 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:29 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:29 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:30 localhost nova_compute[282206]: 2026-02-23 09:48:30.665 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:30 localhost nova_compute[282206]: 2026-02-23 09:48:30.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:30 localhost nova_compute[282206]: 2026-02-23 09:48:30.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:30 localhost nova_compute[282206]: 2026-02-23 09:48:30.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:30 localhost nova_compute[282206]: 2026-02-23 09:48:30.698 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:30 localhost nova_compute[282206]: 2026-02-23 09:48:30.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:30 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:30 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:30 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:30 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:48:30 localhost systemd[1]: tmp-crun.CWCxBC.mount: Deactivated successfully. 
Feb 23 04:48:30 localhost podman[301368]: 2026-02-23 09:48:30.915980661 +0000 UTC m=+0.097075644 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:48:30 localhost 
podman[301368]: 2026-02-23 09:48:30.922077038 +0000 UTC m=+0.103171981 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:48:30 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:48:31 localhost ceph-mon[294160]: mon.np0005626463@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:31 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:31 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:31 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:31 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:31 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:31 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:32 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:32 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:32 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:33 localhost ceph-mon[294160]: 
mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:33 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:33 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:33 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:33 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:33 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:34 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44399 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626461", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 23 04:48:34 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:34 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:34 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:34 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer 
[v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:35 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.27159 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626461"], "force": true, "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:48:35 localhost ceph-mgr[288036]: [cephadm INFO root] Remove daemons mon.np0005626461 Feb 23 04:48:35 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005626461 Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "quorum_status"} v 0) Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "quorum_status"} : dispatch Feb 23 04:48:35 localhost ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005626461: new quorum should be ['np0005626466', 'np0005626463'] (from ['np0005626466', 'np0005626463']) Feb 23 04:48:35 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005626461: new quorum should be ['np0005626466', 'np0005626463'] (from ['np0005626466', 'np0005626463']) Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e11 handle_command mon_command({"prefix": "mon rm", "name": "np0005626461"} v 0) Feb 23 04:48:35 localhost ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005626461 from monmap... 
Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626461"} : dispatch Feb 23 04:48:35 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing monitor np0005626461 from monmap... Feb 23 04:48:35 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005626461 from np0005626461.localdomain -- ports [] Feb 23 04:48:35 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005626461 from np0005626461.localdomain -- ports [] Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@2(peon) e12 my rank is now 1 (was 2) Feb 23 04:48:35 localhost ceph-mgr[288036]: client.44327 ms_handle_reset on v2:172.18.0.103:3300/0 Feb 23 04:48:35 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Feb 23 04:48:35 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:48:35 localhost ceph-mon[294160]: paxos.1).electionLogic(50) init, last seen epoch 50 Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0) Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(audit) log 
[DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:48:35 localhost ceph-mgr[288036]: client.27096 ms_handle_reset on v2:172.18.0.105:3300/0 Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:35 localhost nova_compute[282206]: 2026-02-23 09:48:35.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:35 localhost nova_compute[282206]: 2026-02-23 09:48:35.701 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:35 localhost nova_compute[282206]: 2026-02-23 09:48:35.701 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:35 localhost nova_compute[282206]: 2026-02-23 09:48:35.701 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:35 localhost nova_compute[282206]: 2026-02-23 09:48:35.738 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:35 localhost nova_compute[282206]: 2026-02-23 09:48:35.739 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:35 localhost ceph-mgr[288036]: mgr.server 
handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:35 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:35 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:35 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:36 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:36 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:36 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:36 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:48:37 localhost ceph-mon[294160]: 
mon.np0005626463@1(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 23 04:48:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:37 localhost ceph-mon[294160]: Remove daemons mon.np0005626461 Feb 23 04:48:37 localhost ceph-mon[294160]: Safe to remove mon.np0005626461: new quorum should be ['np0005626466', 'np0005626463'] (from ['np0005626466', 'np0005626463']) Feb 23 04:48:37 localhost ceph-mon[294160]: Removing monitor np0005626461 from monmap... 
Feb 23 04:48:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626461"} : dispatch Feb 23 04:48:37 localhost ceph-mon[294160]: Removing daemon mon.np0005626461 from np0005626461.localdomain -- ports [] Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626466 is new leader, mons np0005626466,np0005626463 in quorum (ranks 0,1) Feb 23 04:48:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:48:37 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 04:48:37 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:37 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:37 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:37 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:37 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections.. Feb 23 04:48:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [] Feb 23 04:48:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections.. 
Feb 23 04:48:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [] Feb 23 04:48:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections.. Feb 23 04:48:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [] Feb 23 04:48:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626461.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:48:38 localhost ceph-mon[294160]: Updating np0005626461.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:38 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:38 localhost ceph-mon[294160]: Updating 
np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:38 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:48:38 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:38 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:38 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:38 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (2) No such file or directory Feb 23 04:48:38 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44407 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:48:38 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 23 04:48:38 localhost ceph-mgr[288036]: [cephadm INFO root] Removed label mon from host np0005626461.localdomain Feb 23 04:48:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed label mon from host np0005626461.localdomain Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:48:39 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev 5660bf6e-c23f-4dad-b2ec-cb5c00c6efcb (Updating node-proxy deployment (+4 -> 4)) Feb 23 04:48:39 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev 5660bf6e-c23f-4dad-b2ec-cb5c00c6efcb (Updating node-proxy deployment (+4 -> 4)) Feb 23 04:48:39 localhost ceph-mgr[288036]: [progress INFO root] Completed event 5660bf6e-c23f-4dad-b2ec-cb5c00c6efcb (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", 
"states": ["destroyed"], "format": "json"} : dispatch Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e12 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(probing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(probing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:48:39 localhost ceph-mon[294160]: paxos.1).electionLogic(52) init, last seen epoch 52 Feb 23 04:48:39 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:48:39 
localhost podman[242954]: time="2026-02-23T09:48:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:48:39 localhost podman[242954]: @ - - [23/Feb/2026:09:48:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:48:39 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... Feb 23 04:48:39 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:48:39 localhost podman[242954]: @ - - [23/Feb/2026:09:48:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18279 "" "Go-http-client/1.1" Feb 23 04:48:39 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:39 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:39 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata 
for mon.np0005626465: (22) Invalid argument Feb 23 04:48:39 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:40 localhost nova_compute[282206]: 2026-02-23 09:48:40.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:40 localhost nova_compute[282206]: 2026-02-23 09:48:40.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:40 localhost nova_compute[282206]: 2026-02-23 09:48:40.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:40 localhost nova_compute[282206]: 2026-02-23 09:48:40.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:40 localhost nova_compute[282206]: 2026-02-23 09:48:40.775 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:40 localhost nova_compute[282206]: 2026-02-23 09:48:40.776 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:40 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:40 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:40 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' 
entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:40 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument Feb 23 04:48:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:48:40 localhost podman[301725]: 2026-02-23 09:48:40.903138739 +0000 UTC m=+0.064809703 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:48:40 localhost podman[301725]: 2026-02-23 09:48:40.916512726 +0000 UTC m=+0.078183720 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:48:40 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:48:41 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:41 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:41 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:41 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument Feb 23 04:48:41 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:42 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:42 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:42 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon 
metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:42 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument Feb 23 04:48:43 localhost ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events Feb 23 04:48:43 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:48:43 localhost openstack_network_exporter[245358]: ERROR 09:48:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:48:43 localhost openstack_network_exporter[245358]: Feb 23 04:48:43 localhost openstack_network_exporter[245358]: ERROR 09:48:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:48:43 localhost openstack_network_exporter[245358]: Feb 23 04:48:43 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:43 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:43 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:43 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626465: (22) Invalid argument Feb 23 04:48:43 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:44 localhost ceph-mon[294160]: paxos.1).electionLogic(53) init, last seen epoch 53, mid-election, bumping Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor 
serial Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 23 04:48:44 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:44 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:44 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:48:44 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:48:44 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626461.lrfquh (monmap changed)... 
Feb 23 04:48:44 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626466 is new leader, mons np0005626466,np0005626463,np0005626465 in quorum (ranks 0,1,2) Feb 23 04:48:44 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626461.lrfquh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:48:44 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 04:48:44 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:44 localhost sshd[301749]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:48:44 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44410 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 23 04:48:44 localhost ceph-mgr[288036]: [cephadm INFO root] Removed label mgr from host np0005626461.localdomain Feb 23 04:48:44 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005626461.localdomain Feb 23 04:48:44 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626465 172.18.0.107:0/438899899; not ready for session (expect reconnect) Feb 23 04:48:44 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:48:44 localhost 
ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:48:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:48:44 localhost podman[301751]: 2026-02-23 09:48:44.904537702 +0000 UTC m=+0.080590004 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter) Feb 23 04:48:44 localhost podman[301751]: 2026-02-23 09:48:44.919553649 +0000 UTC m=+0.095605901 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, name=ubi9/ubi-minimal, distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc.) Feb 23 04:48:44 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:48:45 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0) Feb 23 04:48:45 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0) Feb 23 04:48:45 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626461 (monmap changed)... Feb 23 04:48:45 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626461 (monmap changed)... Feb 23 04:48:45 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:48:45 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:45 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:45 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:45 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:48:45 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:48:45 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626461.lrfquh on np0005626461.localdomain Feb 23 04:48:45 localhost ceph-mon[294160]: from='mgr.26614 ' 
entity='mgr.np0005626463.wtksup' Feb 23 04:48:45 localhost ceph-mon[294160]: Removed label mgr from host np0005626461.localdomain Feb 23 04:48:45 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:45 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:45 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:45 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626461", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:45 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34494 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005626461.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:48:45 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 23 04:48:45 localhost nova_compute[282206]: 2026-02-23 09:48:45.777 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:45 localhost nova_compute[282206]: 2026-02-23 09:48:45.779 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:48:45 localhost nova_compute[282206]: 2026-02-23 09:48:45.779 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:48:45 localhost nova_compute[282206]: 2026-02-23 09:48:45.780 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:45 localhost ceph-mgr[288036]: mgr.server handle_report got status from non-daemon mon.np0005626465 Feb 23 04:48:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:48:45.798+0000 7f44258e4640 -1 mgr.server handle_report got status from non-daemon mon.np0005626465 Feb 23 04:48:45 localhost nova_compute[282206]: 2026-02-23 09:48:45.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:48:45 localhost nova_compute[282206]: 2026-02-23 09:48:45.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:48:45 localhost ceph-mgr[288036]: [cephadm INFO root] Removed label _admin from host np0005626461.localdomain Feb 23 04:48:45 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005626461.localdomain Feb 23 04:48:45 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:46 localhost ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:48:46 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain.devices.0}] v 0) Feb 23 04:48:46 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626461.localdomain}] v 0) Feb 23 04:48:46 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626463 (monmap 
changed)... Feb 23 04:48:46 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:48:46 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:48:46 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:46 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:46 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:46 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:48:46 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. 
Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.566269) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126566315, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 776, "num_deletes": 251, "total_data_size": 834042, "memory_usage": 850072, "flush_reason": "Manual Compaction"} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126572713, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 525456, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17137, "largest_seqno": 17908, "table_properties": {"data_size": 521724, "index_size": 1459, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9626, "raw_average_key_size": 19, "raw_value_size": 513473, "raw_average_value_size": 1063, "num_data_blocks": 58, "num_entries": 483, "num_filter_entries": 483, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840104, "oldest_key_time": 1771840104, "file_creation_time": 1771840126, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 6482 microseconds, and 2569 cpu microseconds. Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.572750) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 525456 bytes OK Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.572772) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.574652) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.574672) EVENT_LOG_v1 {"time_micros": 1771840126574667, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.574693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 829715, prev total WAL file size 
829715, number of live WAL files 2. Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.575316) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323733' seq:72057594037927935, type:22 .. '6B760031353235' seq:0, type:0; will stop at (end) Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(513KB)], [27(17MB)] Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126575353, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18983838, "oldest_snapshot_seqno": -1} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11078 keys, 17918100 bytes, temperature: kUnknown Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126666190, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 17918100, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17853249, "index_size": 36078, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27717, "raw_key_size": 298828, "raw_average_key_size": 26, "raw_value_size": 17662182, 
"raw_average_value_size": 1594, "num_data_blocks": 1367, "num_entries": 11078, "num_filter_entries": 11078, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840126, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.666500) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 17918100 bytes Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.668373) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 208.8 rd, 197.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 17.6 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(70.2) write-amplify(34.1) OK, records in: 11614, records dropped: 536 output_compression: NoCompression Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.668405) EVENT_LOG_v1 {"time_micros": 1771840126668390, "job": 14, "event": "compaction_finished", "compaction_time_micros": 90927, "compaction_time_cpu_micros": 35993, "output_level": 6, "num_output_files": 1, "total_output_size": 17918100, "num_input_records": 11614, "num_output_records": 11078, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126668926, "job": 14, "event": "table_file_deletion", "file_number": 29} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840126671588, 
"job": 14, "event": "table_file_deletion", "file_number": 27} Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.575218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:46 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:48:46.671670) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:48:46 localhost podman[301824]: Feb 23 04:48:46 localhost podman[301824]: 2026-02-23 09:48:46.819544695 +0000 UTC m=+0.072197267 container create 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.42.2, 
GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, RELEASE=main, release=1770267347, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True) Feb 23 04:48:46 localhost ceph-mon[294160]: Reconfiguring crash.np0005626461 (monmap changed)... Feb 23 04:48:46 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626461 on np0005626461.localdomain Feb 23 04:48:46 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:46 localhost ceph-mon[294160]: Removed label _admin from host np0005626461.localdomain Feb 23 04:48:46 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:46 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:46 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:46 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:46 localhost systemd[1]: Started libpod-conmon-9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06.scope. Feb 23 04:48:46 localhost systemd[1]: Started libcrun container. 
Feb 23 04:48:46 localhost podman[301824]: 2026-02-23 09:48:46.790631355 +0000 UTC m=+0.043283957 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:46 localhost podman[301824]: 2026-02-23 09:48:46.899352855 +0000 UTC m=+0.152005437 container init 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7) Feb 23 04:48:46 localhost podman[301824]: 2026-02-23 09:48:46.910174764 +0000 UTC m=+0.162827336 container start 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, version=7, 
org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Feb 23 04:48:46 localhost podman[301824]: 2026-02-23 09:48:46.91071504 +0000 UTC m=+0.163367672 container attach 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, distribution-scope=public, release=1770267347, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.expose-services=, ceph=True, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_BRANCH=main, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:48:46 localhost friendly_benz[301841]: 167 167 Feb 23 04:48:46 localhost systemd[1]: libpod-9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06.scope: Deactivated successfully. Feb 23 04:48:46 localhost podman[301824]: 2026-02-23 09:48:46.912765353 +0000 UTC m=+0.165417965 container died 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, ceph=True, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph) Feb 23 04:48:47 localhost podman[301846]: 2026-02-23 
09:48:47.046946907 +0000 UTC m=+0.121620993 container remove 9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_benz, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7) Feb 23 04:48:47 localhost systemd[1]: libpod-conmon-9a86e0d8c29733c0e36f1c2a9b637cd8c5301e546e42c171addadb95de233c06.scope: Deactivated successfully. Feb 23 04:48:47 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:47 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:47 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... 
Feb 23 04:48:47 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... Feb 23 04:48:47 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Feb 23 04:48:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:48:47 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:47 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:47 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:48:47 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:48:47 localhost podman[301915]: Feb 23 04:48:47 localhost podman[301915]: 2026-02-23 09:48:47.747270591 +0000 UTC m=+0.071761775 container create 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-type=git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, 
vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:48:47 localhost systemd[1]: Started libpod-conmon-99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb.scope. Feb 23 04:48:47 localhost systemd[1]: Started libcrun container. Feb 23 04:48:47 localhost podman[301915]: 2026-02-23 09:48:47.804978127 +0000 UTC m=+0.129469311 container init 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.buildah.version=1.42.2, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main) Feb 23 04:48:47 localhost podman[301915]: 2026-02-23 09:48:47.814843698 +0000 UTC m=+0.139334892 container start 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, version=7) Feb 23 04:48:47 localhost podman[301915]: 2026-02-23 09:48:47.815131526 +0000 UTC m=+0.139622750 container attach 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z) Feb 23 04:48:47 localhost podman[301915]: 2026-02-23 09:48:47.718230717 +0000 UTC m=+0.042721911 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:47 localhost happy_jepsen[301930]: 167 167 Feb 23 04:48:47 localhost systemd[1]: libpod-99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb.scope: Deactivated successfully. 
Feb 23 04:48:47 localhost podman[301915]: 2026-02-23 09:48:47.820028465 +0000 UTC m=+0.144519669 container died 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:48:47 localhost systemd[1]: var-lib-containers-storage-overlay-6ccc8cd821e0e77de2133330dd6e4ebb587fdbc52d1d1bb0b6c7f2ddb14a27de-merged.mount: Deactivated successfully. Feb 23 04:48:47 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:48:47 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:48:47 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:47 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:47 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:48:47 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:47 localhost systemd[1]: tmp-crun.nZg3t9.mount: Deactivated successfully. Feb 23 04:48:47 localhost systemd[1]: var-lib-containers-storage-overlay-f924e07e34d7fc902ec0d8619ac4a9040361719cf6ecd2af6fe513a29e2ad9aa-merged.mount: Deactivated successfully. Feb 23 04:48:47 localhost podman[301935]: 2026-02-23 09:48:47.924300509 +0000 UTC m=+0.088678851 container remove 99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_jepsen, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, distribution-scope=public, io.openshift.expose-services=, ceph=True, architecture=x86_64, 
com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:48:47 localhost systemd[1]: libpod-conmon-99386ab6fe744e6cd82d1d14f586f756ededeb69d9f4787327cbc66ef9c27dfb.scope: Deactivated successfully. Feb 23 04:48:48 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:48 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:48 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Feb 23 04:48:48 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... 
Feb 23 04:48:48 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0) Feb 23 04:48:48 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:48:48 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:48 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:48 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:48:48 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:48:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:48:48.551 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:48:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:48:48.552 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:48:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:48:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:48:48 localhost 
podman[302010]: Feb 23 04:48:48 localhost podman[302010]: 2026-02-23 09:48:48.713372623 +0000 UTC m=+0.074207809 container create 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1770267347, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:48:48 localhost systemd[1]: Started libpod-conmon-8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f.scope. Feb 23 04:48:48 localhost systemd[1]: Started libcrun container. 
Feb 23 04:48:48 localhost podman[302010]: 2026-02-23 09:48:48.780007911 +0000 UTC m=+0.140843107 container init 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, vcs-type=git, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Feb 23 04:48:48 localhost podman[302010]: 2026-02-23 09:48:48.684820505 +0000 UTC m=+0.045655741 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:48 localhost podman[302010]: 2026-02-23 09:48:48.789119359 +0000 UTC m=+0.149954555 container start 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1770267347, ceph=True, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main) Feb 23 04:48:48 localhost podman[302010]: 2026-02-23 09:48:48.789350955 +0000 UTC m=+0.150186191 container attach 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , vcs-type=git, 
GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, name=rhceph, version=7, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, CEPH_POINT_RELEASE=, release=1770267347) Feb 23 04:48:48 localhost serene_goldwasser[302025]: 167 167 Feb 23 04:48:48 localhost systemd[1]: libpod-8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f.scope: Deactivated successfully. Feb 23 04:48:48 localhost podman[302010]: 2026-02-23 09:48:48.791530902 +0000 UTC m=+0.152366118 container died 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:48:48 localhost systemd[1]: var-lib-containers-storage-overlay-23c6762863234f041e65f11cb4d3527d94b7f66d53f48e5e21b871f6dec6e3ea-merged.mount: Deactivated successfully. Feb 23 04:48:48 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... Feb 23 04:48:48 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:48:48 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:48 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:48 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:48:48 localhost podman[302030]: 2026-02-23 09:48:48.885000697 +0000 UTC m=+0.085136902 container remove 8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_goldwasser, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, 
CEPH_POINT_RELEASE=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:48:48 localhost systemd[1]: libpod-conmon-8944d372df72257636a21571f316571e9b0c25ec8429c36e404ff70d7520ae5f.scope: Deactivated successfully. Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:48:49 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... Feb 23 04:48:49 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 23 04:48:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:49 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:49 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon 
mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:48:49 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:48:49 localhost podman[302105]: Feb 23 04:48:49 localhost podman[302105]: 2026-02-23 09:48:49.682784807 +0000 UTC m=+0.082925004 container create f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, version=7, io.buildah.version=1.42.2, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Feb 23 04:48:49 localhost systemd[1]: Started libpod-conmon-f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92.scope. Feb 23 04:48:49 localhost systemd[1]: Started libcrun container. 
Feb 23 04:48:49 localhost podman[302105]: 2026-02-23 09:48:49.743501355 +0000 UTC m=+0.143641542 container init f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Feb 23 04:48:49 localhost podman[302105]: 2026-02-23 09:48:49.646367609 +0000 UTC m=+0.046507826 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:49 localhost podman[302105]: 2026-02-23 09:48:49.752114437 +0000 UTC m=+0.152254634 container start f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, architecture=x86_64, 
vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:48:49 localhost podman[302105]: 2026-02-23 09:48:49.752408756 +0000 UTC m=+0.152548983 container attach f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, architecture=x86_64, GIT_CLEAN=True, version=7, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:48:49 localhost nervous_pare[302120]: 167 167 Feb 23 04:48:49 localhost systemd[1]: libpod-f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92.scope: Deactivated successfully. Feb 23 04:48:49 localhost podman[302105]: 2026-02-23 09:48:49.755188061 +0000 UTC m=+0.155328278 container died f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.expose-services=, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2) Feb 23 04:48:49 localhost systemd[1]: 
var-lib-containers-storage-overlay-fd6e93a01c30011113dbcf5b715246af0fcf1806fa795e42fe054e804dfb8762-merged.mount: Deactivated successfully. Feb 23 04:48:49 localhost podman[302125]: 2026-02-23 09:48:49.849016867 +0000 UTC m=+0.084937117 container remove f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_pare, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-02-09T10:25:24Z) Feb 23 04:48:49 localhost systemd[1]: libpod-conmon-f5d6dbd1b4bab38207fdde3adcc4afc2337416115e621ab9fc53fe54aca62f92.scope: Deactivated successfully. 
Feb 23 04:48:49 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 04:48:49 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 04:48:49 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 04:48:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 04:48:49 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 04:48:49 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:49 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:49 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 04:48:49 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 04:48:50 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)...
Feb 23 04:48:50 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:48:50 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:48:50 localhost
podman[302195]: Feb 23 04:48:50 localhost podman[302195]: 2026-02-23 09:48:50.582640895 +0000 UTC m=+0.080467740 container create 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, architecture=x86_64, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:48:50 localhost systemd[1]: Started libpod-conmon-2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0.scope. Feb 23 04:48:50 localhost systemd[1]: Started libcrun container. 
Feb 23 04:48:50 localhost podman[302195]: 2026-02-23 09:48:50.542828512 +0000 UTC m=+0.040655417 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:48:50 localhost podman[302195]: 2026-02-23 09:48:50.645508598 +0000 UTC m=+0.143335443 container init 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=1770267347) Feb 23 04:48:50 localhost podman[302195]: 2026-02-23 09:48:50.654643556 +0000 UTC m=+0.152470401 container start 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, release=1770267347, version=7, io.openshift.tags=rhceph ceph, 
CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:48:50 localhost podman[302195]: 2026-02-23 09:48:50.654899733 +0000 UTC m=+0.152726588 container attach 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, com.redhat.component=rhceph-container, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, architecture=x86_64, 
io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:48:50 localhost modest_lehmann[302210]: 167 167 Feb 23 04:48:50 localhost systemd[1]: libpod-2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0.scope: Deactivated successfully. Feb 23 04:48:50 localhost podman[302195]: 2026-02-23 09:48:50.658331718 +0000 UTC m=+0.156158573 container died 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, distribution-scope=public, release=1770267347, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, RELEASE=main) Feb 23 04:48:50 localhost podman[302215]: 2026-02-23 
09:48:50.750276046 +0000 UTC m=+0.083652356 container remove 2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_lehmann, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.42.2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=) Feb 23 04:48:50 localhost systemd[1]: libpod-conmon-2b9d26d78f3be7756286cd09de8328fc7c5c3f39df4e4afa226164a52e5043c0.scope: Deactivated successfully. 
Feb 23 04:48:50 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 04:48:50 localhost nova_compute[282206]: 2026-02-23 09:48:50.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:48:50 localhost nova_compute[282206]: 2026-02-23 09:48:50.823 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:48:50 localhost nova_compute[282206]: 2026-02-23 09:48:50.824 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:48:50 localhost nova_compute[282206]: 2026-02-23 09:48:50.824 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:48:50 localhost systemd[1]: var-lib-containers-storage-overlay-e588d0526933e2b45d6bb4ed1c5c80351e4c69b75a09351cd1fbffcc2bbb3e6f-merged.mount: Deactivated successfully.
Feb 23 04:48:50 localhost nova_compute[282206]: 2026-02-23 09:48:50.860 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:48:50 localhost nova_compute[282206]: 2026-02-23 09:48:50.860 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:48:50 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 04:48:50 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 04:48:50 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 04:48:50 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 04:48:50 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:48:50 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 04:48:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 04:48:50 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:50 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 04:48:50 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 04:48:51 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)...
Feb 23 04:48:51 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain
Feb 23 04:48:51 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 04:48:51 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 04:48:51 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:51 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:51 localhost ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)...
Feb 23 04:48:51 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:48:51 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain
Feb 23 04:48:51 localhost ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:48:51 localhost podman[302284]:
Feb 23 04:48:51 localhost podman[302284]: 2026-02-23 09:48:51.489503424 +0000 UTC m=+0.074508098 container create 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_BRANCH=main)
Feb 23 04:48:51 localhost systemd[1]: Started libpod-conmon-9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce.scope.
Feb 23 04:48:51 localhost systemd[1]: Started libcrun container.
Feb 23 04:48:51 localhost podman[302284]: 2026-02-23 09:48:51.552053059 +0000 UTC m=+0.137057733 container init 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, name=rhceph, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z)
Feb 23 04:48:51 localhost podman[302284]: 2026-02-23 09:48:51.460022078 +0000 UTC m=+0.045026812 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 23 04:48:51 localhost blissful_napier[302299]: 167 167
Feb 23 04:48:51 localhost podman[302284]: 2026-02-23 09:48:51.56425311 +0000 UTC m=+0.149257784 container start 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, release=1770267347, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 23 04:48:51 localhost systemd[1]: libpod-9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce.scope: Deactivated successfully.
Feb 23 04:48:51 localhost podman[302284]: 2026-02-23 09:48:51.564546658 +0000 UTC m=+0.149551372 container attach 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, name=rhceph, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 23 04:48:51 localhost podman[302284]: 2026-02-23 09:48:51.566694564 +0000 UTC m=+0.151699288 container died 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Feb 23 04:48:51 localhost podman[302304]: 2026-02-23 09:48:51.658820248 +0000 UTC m=+0.085019299 container remove 9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_napier, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, release=1770267347, RELEASE=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 23 04:48:51 localhost systemd[1]: libpod-conmon-9672a74d3d6f94bf556146361587208e3032672f5878b20c1914ece882df91ce.scope: Deactivated successfully.
Feb 23 04:48:51 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 04:48:51 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 04:48:51 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:48:51 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:48:51 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 04:48:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:48:51 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:51 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:51 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:48:51 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:48:51 localhost systemd[1]: var-lib-containers-storage-overlay-8dd0e2f6588fd66eb7098c0550d966c9f1413363904e613588fc090ca2842de3-merged.mount: Deactivated successfully.
Feb 23 04:48:51 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:48:52 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:48:52 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:48:52 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 23 04:48:52 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 23 04:48:52 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 23 04:48:52 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 04:48:52 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:52 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:52 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:48:52 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:52 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:48:52 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:52 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 04:48:53 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:48:53 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:48:53 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Feb 23 04:48:53 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 23 04:48:53 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 04:48:53 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Feb 23 04:48:53 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:53 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:53 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:48:53 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:48:53 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 04:48:53 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:48:53 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:53 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:53 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 04:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 04:48:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 04:48:53 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:48:53 localhost podman[302322]: 2026-02-23 09:48:53.912691454 +0000 UTC m=+0.083517752 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 04:48:53 localhost podman[302321]: 2026-02-23 09:48:53.960889732 +0000 UTC m=+0.132811714 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 23 04:48:53 localhost podman[302322]: 2026-02-23 09:48:53.979360644 +0000 UTC m=+0.150186892 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 04:48:54 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 04:48:54 localhost podman[302321]: 2026-02-23 09:48:54.036262576 +0000 UTC m=+0.208184588 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 23 04:48:54 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 04:48:54 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:48:54 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:48:54 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:48:54 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:48:54 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 04:48:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:48:54 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:54 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:54 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:48:54 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:48:54 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 04:48:54 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:48:54 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:54 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:54 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:48:54 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:48:55 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:48:55 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:48:55 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:48:55 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:48:55 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 04:48:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:48:55 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 04:48:55 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 04:48:55 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:48:55 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:48:55 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 04:48:55 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 04:48:55 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:48:55 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:48:55 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:55 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:48:55 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:48:55 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:48:55 localhost nova_compute[282206]: 2026-02-23 09:48:55.861 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:48:55 localhost nova_compute[282206]: 2026-02-23 09:48:55.863 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:48:55 localhost nova_compute[282206]: 2026-02-23 09:48:55.863 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:48:55 localhost nova_compute[282206]: 2026-02-23 09:48:55.864 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:48:55 localhost nova_compute[282206]: 2026-02-23 09:48:55.886 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:48:55 localhost nova_compute[282206]: 2026-02-23 09:48:55.887 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:48:55 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:48:56 localhost ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.139 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf06184-990d-42d6-91aa-84651e9a4ebe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.135834', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'dea3a4a2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '2d7a93e8672f7cdb8c697f12458572b096a4b0c06cc52b64c3a11f6f75f808a7'}]}, 'timestamp': '2026-02-23 09:48:56.140827', '_unique_id': 'ffea7ba7d9f0457c8f4a72e8584aee02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR 
oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.142 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.172 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e2930937-7dfb-4424-ab39-634d90162cd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.143809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dea8a1b4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'd5ecb5de55d57cdc9bd3051fc6548088495f98215baeddf619a23a4a65fae245'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.143809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dea8b758-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'd3da68b32dcc75bab832ade7d8c58b6eb9b5f8f107c126c66c34ce2cdcc4086b'}]}, 'timestamp': '2026-02-23 09:48:56.174085', '_unique_id': 'fa43ed2e167042c6a42e38a4ae1f3514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2420f8cf-0458-4d09-b7d2-9f6377ed260e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.176609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dea936a6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'c062af07a614954ab17cae31513d50bfba31755d49b2a3337df5c297119f4d8a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T09:48:56.176609', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dea94fc4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '5a7d974109475c362934bdc5a1e279bce5e53d0d090b7beaf16f0e3ca350fada'}]}, 'timestamp': '2026-02-23 09:48:56.178009', '_unique_id': '34ac02308f6a49fbb316689bfa09bc7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.179 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.181 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca092a8f-8484-417d-b8e3-53b86ef3fa92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.181476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dea9f276-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 
'74da2ad9d45711689df562f8d9321bfbf84f18bb03bff63a330f7b85a526f541'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.181476', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deaa0c16-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '77f3adbc6f5a30f29376e0d78d8fff13fae4c896feb25ee1ba41e8f354230eba'}]}, 'timestamp': '2026-02-23 09:48:56.182792', '_unique_id': '3425f55ebbc743acb6d7db07c0d2c017'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.184 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.184 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.201 12 DEBUG 
ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 11460000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8ced9ae-7b97-4f2f-8000-7ff1bf081c8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11460000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:48:56.186043', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'deaceef4-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.390507412, 'message_signature': '6c56c548a1c1aa03784069f163f22e2f8c19ece221f5f46c47fc4471fd97b629'}]}, 'timestamp': '2026-02-23 09:48:56.201680', '_unique_id': '7ca5d72f653d4a2e9e3f5034a68280a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 
23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging conn = 
self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: 
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 
04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.202 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.214 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '84424edf-4d17-49d6-91bb-f75c9b4099a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.204053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deaee470-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '5fdd84bacd818eba76300a7e6474a4970dbfe076449aabc49f252360e281d192'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.204053', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deaef8b6-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '37c41befd37b852505e8248b719e58904a7177dcdcb79d3b26316e1989690ed1'}]}, 'timestamp': '2026-02-23 09:48:56.215058', '_unique_id': 'f17144799db848c1ae7caff4cdb9fdf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.216 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.217 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '40cac3ce-43ae-4711-9617-e6f57a70b1ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.217951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deaf83b2-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '3a8fd82d3941757ab172ff835d5fe895c9b581c26cdbe696e95f5fcf5b30712a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 
'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.217951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deaf9d84-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'b7dfcf078219e71767f99f4943d3f3475ab5796db8e0b6e83388104ff44abf36'}]}, 'timestamp': '2026-02-23 09:48:56.219286', '_unique_id': '7ff708b6b841499a89a9e8eae8666d07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in 
connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR 
oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:48:56.220 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.220 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe7bd427-e7ce-4c18-8457-2364b59ea463', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.222510', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb035a0-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'a344fb1639bdf2a21184a2b8363132fbaecef784b2e03f3f54dd521ffdc2fd03'}]}, 'timestamp': '2026-02-23 09:48:56.223253', '_unique_id': '874597120c3f4ba5af5873d2b99c1510'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.224 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '032b8d73-3d8a-4e94-a98f-3efef62b46c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.226948', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb0e34c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'babc36d92ffdd855cc36dc3e96a490a3e10ad18702edae29b70791ecfcf5d16f'}]}, 'timestamp': '2026-02-23 09:48:56.227660', '_unique_id': 'e0193ffa4a6c4423841cdc5797134d85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '511374e9-230b-4e2e-9a2d-6df7db48fd1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.230807', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb17b4a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '8bb29df29ceaf3d3d5dc6ec32c2c424478b65118a119d61362a23de3c74116b6'}]}, 'timestamp': '2026-02-23 09:48:56.231564', '_unique_id': '9be6122fc22245bb9439fe5ece5faa18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b215bbda-0a99-4aa1-adf9-6662599d0f59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.234151', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb1f764-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'ced36dd701c1777e2835810b10449d5f8209aab7fde83d1ab7e5753f672aa881'}]}, 'timestamp': '2026-02-23 09:48:56.234610', '_unique_id': '10431a8ca0034d91adc5fdbb99aa77fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.235 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0aee4ae-814d-44ea-b411-b5cd348354a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:48:56.236689', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'deb25bdc-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.390507412, 'message_signature': '3add3044e49ea38ef6b07d025d4766932053d9622e1467b48d537e2d495df2d1'}]}, 'timestamp': '2026-02-23 09:48:56.237168', '_unique_id': '2f8129d4c1c74d3f9bca9784e6499712'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ca3a54d-288d-44ff-9511-c5199b935bdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.239216', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb2bd0c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '822b8657a7059213ceb214dd6f4c791497a9071fe5b1b72d3bee14cbb5e955cc'}]}, 'timestamp': '2026-02-23 09:48:56.239667', '_unique_id': 'da5cea82699f463fb8807dba6b067189'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '813e4f2b-8a0d-4b25-9ef3-f9ccc8452650', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.241742', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb320ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '79a66b048496b66f75440c120fbaafef2c4ae2b93557beed97972387ec0dc9fd'}]}, 'timestamp': '2026-02-23 09:48:56.242223', '_unique_id': '3622dff546ba4a24b219f1116956733b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.243 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '16ee4dda-4bc7-4ac4-9045-5748fc6f44d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.244270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb38232-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '25c6c4ccd063ca0cdd4a4b5df70f23e7a57543428e327580a92634d2664ef14c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.244270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb393ee-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': '1ac6bc53a4ae0c30f78c58f0e45c1589a533c27e3e8e4800d0d7beb87037d764'}]}, 'timestamp': '2026-02-23 09:48:56.245149', '_unique_id': '8d7408caab1741d393e5006ef7c21a47'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.246 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5529999e-8462-440b-a039-9b42fbbedebf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.247244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb3f654-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '5b9d3bc5c193b73d90b19a85ec775bf1d620ab4f070518548d406f6e2c120617'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T09:48:56.247244', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb40612-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '5ee3e4bcd7b04b2b4ae3815953566c071c0e75d4145e6c6fefc67d73b92777cd'}]}, 'timestamp': '2026-02-23 09:48:56.248092', '_unique_id': 'd3b0dedb577f4423b2fd4cf4ab8b2510'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.249 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e42feefc-e886-4b12-a8ba-183ab81b6bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.250187', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb4663e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'eaab39345bc7cf8cb0987e44b5218677f4caa77a27da219b2849d947503afa1d'}]}, 'timestamp': '2026-02-23 09:48:56.250465', '_unique_id': 'be120899a7334b33be2722675d494655'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e973ccf8-aa3c-4bf2-a1dc-865f234fdd8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.251725', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb4a248-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': '57dd8c40e9d4595bcc6c9a6ec4a2c9164dd27d9cd5394c27b90227a65abd8224'}]}, 'timestamp': '2026-02-23 09:48:56.252024', '_unique_id': 'e11f300322084809bbf1b8e9b369abac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.252 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fbc0759-e266-4781-97ad-9171e2f17ced', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.253297', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb4df9c-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'a3456c5ba8ee2335265899029df5de46b9ee3063f678424a69ccef4560e707dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.253297', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb4e988-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.333350432, 'message_signature': 'a5a9198cc70f95e475c9bf7e35a4d024e6c45d264d36a6b7d83a4b601fab2df5'}]}, 'timestamp': '2026-02-23 09:48:56.253807', '_unique_id': '1cae021d530b474f88695916d301cfb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.254 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8d72495-cdc4-43c3-a22b-2d857731d773', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:48:56.255123', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'deb52718-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.32537355, 'message_signature': 'ffb12ca417bf20bdba08189a2ca89835b2b80e6670f8bb8b04f9c442367be499'}]}, 'timestamp': '2026-02-23 09:48:56.255402', '_unique_id': '1c1d3d7bad784b6da98c079f598833eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.255 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed0de983-4df3-4f75-95df-6aac5198b483', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:48:56.256647', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'deb5626e-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': 'fb467dc5bad82ccf63064900f4ec01b2464c7a56403b022a5a504b49b288a334'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:48:56.256647', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'deb56f7a-109c-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11576.393561005, 'message_signature': '7474be0551220ae1943323e7a8ceedc71974ffb39edee3ab119eb8a7aa4c8eed'}]}, 'timestamp': '2026-02-23 09:48:56.257238', '_unique_id': '00ab5926e34a45968066099aff9e6d39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:48:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:48:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:48:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 04:48:56 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:48:56 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:48:56 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626465 (monmap changed)... Feb 23 04:48:56 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626465 (monmap changed)... Feb 23 04:48:56 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0) Feb 23 04:48:56 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:56 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0) Feb 23 04:48:56 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch Feb 23 04:48:56 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:56 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:56 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:48:56 localhost 
ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:48:56 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)... Feb 23 04:48:56 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:48:56 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:56 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:56 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:48:57 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:48:57 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:48:57 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626466 (monmap changed)... Feb 23 04:48:57 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626466 (monmap changed)... 
Feb 23 04:48:57 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:48:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:57 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:57 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:57 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:48:57 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:48:57 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34500 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005626461.localdomain", "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:48:57 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 23 04:48:57 localhost ceph-mgr[288036]: [cephadm INFO root] Added label _no_schedule to host np0005626461.localdomain Feb 23 04:48:57 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005626461.localdomain Feb 23 04:48:57 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 23 04:48:57 
localhost ceph-mgr[288036]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626461.localdomain Feb 23 04:48:57 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626461.localdomain Feb 23 04:48:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:48:57 localhost ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)... Feb 23 04:48:57 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:48:57 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:57 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:57 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:57 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:48:57 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:57 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:57 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:48:57 localhost systemd[1]: tmp-crun.6rclLq.mount: Deactivated successfully. 
Feb 23 04:48:57 localhost podman[302370]: 2026-02-23 09:48:57.914789138 +0000 UTC m=+0.089484985 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible) Feb 23 04:48:57 localhost podman[302370]: 2026-02-23 09:48:57.929288809 +0000 UTC m=+0.103984656 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute) Feb 23 04:48:57 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:48:58 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:48:58 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:48:58 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Feb 23 04:48:58 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... Feb 23 04:48:58 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Feb 23 04:48:58 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:48:58 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:58 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:58 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:48:58 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:48:58 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.34506 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005626461.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Feb 
23 04:48:59 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:48:59 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)... Feb 23 04:48:59 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:48:59 localhost ceph-mon[294160]: Added label _no_schedule to host np0005626461.localdomain Feb 23 04:48:59 localhost ceph-mon[294160]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005626461.localdomain Feb 23 04:48:59 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:59 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' Feb 23 04:48:59 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:48:59 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:48:59 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Feb 23 04:48:59 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... 
Feb 23 04:48:59 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Feb 23 04:48:59 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:48:59 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:48:59 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:48:59 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:48:59 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:48:59 localhost sshd[302390]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:48:59 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:00 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44431 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005626461.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:49:00 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Feb 23 04:49:00 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} v 0) Feb 23 04:49:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key 
del","key":"mgr/cephadm/host.np0005626461.localdomain"} : dispatch
Feb 23 04:49:00 localhost ceph-mgr[288036]: [cephadm INFO root] Removed host np0005626461.localdomain
Feb 23 04:49:00 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removed host np0005626461.localdomain
Feb 23 04:49:00 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)...
Feb 23 04:49:00 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:00 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)...
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 23 04:49:00 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} : dispatch
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"} : dispatch
Feb 23 04:49:00 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain"}]': finished
Feb 23 04:49:00 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:49:00 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:49:00 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 04:49:00 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 04:49:00 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 04:49:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:49:00 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:00 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 04:49:00 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 04:49:00 localhost nova_compute[282206]: 2026-02-23 09:49:00.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:00 localhost nova_compute[282206]: 2026-02-23 09:49:00.891 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:00 localhost nova_compute[282206]: 2026-02-23 09:49:00.891 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:49:00 localhost nova_compute[282206]: 2026-02-23 09:49:00.891 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:00 localhost nova_compute[282206]: 2026-02-23 09:49:00.928 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:49:00 localhost nova_compute[282206]: 2026-02-23 09:49:00.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:01 localhost ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:49:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 04:49:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5308 writes, 23K keys, 5308 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5308 writes, 741 syncs, 7.16 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 156 writes, 536 keys, 156 commit groups, 1.0 writes per commit group, ingest: 0.64 MB, 0.00 MB/s#012Interval WAL: 156 writes, 62 syncs, 2.52 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 04:49:01 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:49:01 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:49:01 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 04:49:01 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 04:49:01 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 04:49:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:49:01 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 04:49:01 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 04:49:01 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:01 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:01 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 04:49:01 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 04:49:01 localhost ceph-mon[294160]: Removed host np0005626461.localdomain
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:01 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)...
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:49:01 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:49:01 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:49:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:49:01 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:01 localhost podman[302392]: 2026-02-23 09:49:01.9093399 +0000 UTC m=+0.080395848 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 23 04:49:01 localhost podman[302392]: 2026-02-23 09:49:01.943236531 +0000 UTC m=+0.114292439 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 23 04:49:01 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:49:02 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:49:02 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:49:02 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 04:49:02 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 04:49:02 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 04:49:02 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:49:02 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 23 04:49:02 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 23 04:49:02 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:02 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:02 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:49:02 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:49:02 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)...
Feb 23 04:49:02 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain
Feb 23 04:49:02 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:02 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:02 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:49:03 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:49:03 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:49:03 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:03 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:03 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 04:49:03 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:49:03 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:03 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:03 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:03 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:03 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:03 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:03 localhost ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)...
Feb 23 04:49:03 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:49:03 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:03 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:03 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:49:03 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:04 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:04 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:04 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:04 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:04 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:04 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:04 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:04 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:04 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 04:49:04 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev a9ba5473-cf2f-46a7-90d2-1ffa422d6109 (Updating node-proxy deployment (+3 -> 3))
Feb 23 04:49:04 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev a9ba5473-cf2f-46a7-90d2-1ffa422d6109 (Updating node-proxy deployment (+3 -> 3))
Feb 23 04:49:04 localhost ceph-mgr[288036]: [progress INFO root] Completed event a9ba5473-cf2f-46a7-90d2-1ffa422d6109 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 23 04:49:04 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 04:49:04 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 04:49:05 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:05 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:05 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:05 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:05 localhost nova_compute[282206]: 2026-02-23 09:49:05.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:05 localhost nova_compute[282206]: 2026-02-23 09:49:05.931 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:05 localhost nova_compute[282206]: 2026-02-23 09:49:05.932 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:49:05 localhost nova_compute[282206]: 2026-02-23 09:49:05.932 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:05 localhost nova_compute[282206]: 2026-02-23 09:49:05.973 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:49:05 localhost nova_compute[282206]: 2026-02-23 09:49:05.974 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:06 localhost ceph-mon[294160]: mon.np0005626463@1(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:49:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 23 04:49:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.1 total, 600.0 interval#012Cumulative writes: 5568 writes, 24K keys, 5568 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5568 writes, 778 syncs, 7.16 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 147 writes, 320 keys, 147 commit groups, 1.0 writes per commit group, ingest: 0.43 MB, 0.00 MB/s#012Interval WAL: 147 writes, 73 syncs, 2.01 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 23 04:49:07 localhost ceph-mgr[288036]: [balancer INFO root] Optimize plan auto_2026-02-23_09:49:07
Feb 23 04:49:07 localhost ceph-mgr[288036]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 23 04:49:07 localhost ceph-mgr[288036]: [balancer INFO root] do_upmap
Feb 23 04:49:07 localhost ceph-mgr[288036]: [balancer INFO root] pools ['manila_data', 'manila_metadata', '.mgr', 'backups', 'vms', 'volumes', 'images']
Feb 23 04:49:07 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:07 localhost ceph-mgr[288036]: [balancer INFO root] prepared 0/10 changes
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] _maybe_adjust
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.0033329080297319932 of space, bias 1.0, pg target 0.6665816059463986 quantized to 32 (current 32)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 23 04:49:08 localhost ceph-mgr[288036]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 04:49:08 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 04:49:08 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 04:49:08 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections..
Feb 23 04:49:08 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: []
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 23 04:49:08 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44449 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 04:49:08 localhost ceph-mgr[288036]: [cephadm INFO root] Saving service mon spec with placement label:mon
Feb 23 04:49:08 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Feb 23 04:49:08 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 04:49:08 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:08 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:08 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 23 04:49:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:49:08 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 04:49:08 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev 63db0af8-3d95-4b25-984e-41f9ee1ef319 (Updating node-proxy deployment (+3 -> 3))
Feb 23 04:49:08 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev 63db0af8-3d95-4b25-984e-41f9ee1ef319 (Updating node-proxy deployment (+3 -> 3))
Feb 23 04:49:08 localhost ceph-mgr[288036]: [progress INFO root] Completed event 63db0af8-3d95-4b25-984e-41f9ee1ef319 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 23 04:49:08 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 23 04:49:08 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 23 04:49:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:08 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 04:49:08 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:09 localhost ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events
Feb 23 04:49:09 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 04:49:09 localhost podman[242954]: time="2026-02-23T09:49:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 04:49:09 localhost podman[242954]: @ - - [23/Feb/2026:09:49:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1"
Feb 23 04:49:09 localhost podman[242954]: @ - - [23/Feb/2026:09:49:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18271 "" "Go-http-client/1.1"
Feb 23 04:49:09 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.44452 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005626466", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 23 04:49:09 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:09 localhost ceph-mon[294160]: Saving service mon spec with placement label:mon
Feb 23 04:49:09 localhost ceph-mon[294160]: from='mgr.26614 ' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:10 localhost nova_compute[282206]: 2026-02-23 09:49:10.974 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:10 localhost nova_compute[282206]: 2026-02-23 09:49:10.976 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:10 localhost nova_compute[282206]: 2026-02-23 09:49:10.977 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:49:10 localhost nova_compute[282206]: 2026-02-23 09:49:10.977 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:10 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.54134 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005626466"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 23 04:49:10 localhost ceph-mgr[288036]: [cephadm INFO root] Remove daemons mon.np0005626466
Feb 23 04:49:10 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005626466
Feb 23 04:49:10 localhost ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005626466: new quorum should be ['np0005626463', 'np0005626465'] (from ['np0005626463', 'np0005626465'])
Feb 23 04:49:10 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005626466: new quorum should be ['np0005626463', 'np0005626465'] (from ['np0005626463', 'np0005626465'])
Feb 23 04:49:10 localhost ceph-mgr[288036]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005626466 from monmap...
Feb 23 04:49:10 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing monitor np0005626466 from monmap...
Feb 23 04:49:10 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005626466 from np0005626466.localdomain -- ports []
Feb 23 04:49:10 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005626466 from np0005626466.localdomain -- ports []
Feb 23 04:49:11 localhost ceph-mgr[288036]: client.27096 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 04:49:11 localhost ceph-mgr[288036]: client.44327 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 04:49:11 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 04:49:11 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "quorum_status"} : dispatch
Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e13 handle_command mon_command({"prefix": "mon rm", "name": "np0005626466"} v 0)
Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626466"} : dispatch
Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@1(peon) e14 my rank is now 0 (was 1)
Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626463"} v 0)
Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch
Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0)
Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:49:11 localhost nova_compute[282206]: 2026-02-23 09:49:11.073 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:11 localhost nova_compute[282206]: 2026-02-23 09:49:11.074 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(probing) e14 handle_auth_request failed to assign global_id Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:49:11 localhost ceph-mon[294160]: paxos.0).electionLogic(56) init, last seen epoch 56 Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1) Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : monmap epoch 14 Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : last_changed 2026-02-23T09:49:10.990173+0000 Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : created 2026-02-23T07:36:01.997603+0000 Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 23 04:49:11 
localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463 Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465 Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e84: 6 total, 6 up, 6 in Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e32: np0005626463.wtksup(active, since 63s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK Feb 23 04:49:11 localhost ceph-mon[294160]: Remove daemons mon.np0005626466 Feb 23 04:49:11 localhost ceph-mon[294160]: Safe to remove mon.np0005626466: new quorum should be ['np0005626463', 'np0005626465'] (from ['np0005626463', 'np0005626465']) Feb 23 04:49:11 localhost ceph-mon[294160]: Removing monitor np0005626466 from monmap... 
Feb 23 04:49:11 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon rm", "name": "np0005626466"} : dispatch Feb 23 04:49:11 localhost ceph-mon[294160]: Removing daemon mon.np0005626466 from np0005626466.localdomain -- ports [] Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1) Feb 23 04:49:11 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/553441531' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:49:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 23 04:49:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:49:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:11 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:11 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:11 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:11 localhost podman[302766]: 2026-02-23 09:49:11.911372662 +0000 UTC m=+0.085571925 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:49:11 localhost podman[302766]: 2026-02-23 09:49:11.925284596 +0000 UTC m=+0.099483869 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:49:11 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:49:12 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:49:12 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:12 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:12 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:12 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:12 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:49:12 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3509329788' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:49:12 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:12 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:13 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:13 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:13 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 
04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost openstack_network_exporter[245358]: ERROR 09:49:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:49:13 localhost openstack_network_exporter[245358]: Feb 23 04:49:13 localhost openstack_network_exporter[245358]: ERROR 09:49:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:49:13 localhost openstack_network_exporter[245358]: Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:13 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev 
8cfed40c-2653-46f8-be56-0c09e5ecb3c2 (Updating node-proxy deployment (+3 -> 3)) Feb 23 04:49:13 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev 8cfed40c-2653-46f8-be56-0c09e5ecb3c2 (Updating node-proxy deployment (+3 -> 3)) Feb 23 04:49:13 localhost ceph-mgr[288036]: [progress INFO root] Completed event 8cfed40c-2653-46f8-be56-0c09e5ecb3c2 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 23 04:49:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:49:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:13 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:13 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:49:13 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:49:13 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.067 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.068 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.068 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:49:14 localhost ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events Feb 23 04:49:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:49:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:14 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:49:14 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost podman[303179]: Feb 23 04:49:14 localhost podman[303179]: 2026-02-23 09:49:14.426544001 +0000 UTC m=+0.080205323 container create ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, architecture=x86_64, vcs-type=git, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., release=1770267347, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Feb 23 04:49:14 localhost systemd[1]: Started 
libpod-conmon-ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204.scope. Feb 23 04:49:14 localhost systemd[1]: Started libcrun container. Feb 23 04:49:14 localhost podman[303179]: 2026-02-23 09:49:14.392248497 +0000 UTC m=+0.045909839 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:14 localhost podman[303179]: 2026-02-23 09:49:14.494081176 +0000 UTC m=+0.147742508 container init ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux , release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main) Feb 23 04:49:14 localhost podman[303179]: 2026-02-23 09:49:14.503411861 +0000 UTC m=+0.157073183 container start ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, RELEASE=main, distribution-scope=public, 
GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1770267347, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:49:14 localhost podman[303179]: 2026-02-23 09:49:14.503754451 +0000 UTC m=+0.157415813 container attach ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, distribution-scope=public, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, 
url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:49:14 localhost stoic_rubin[303195]: 167 167 Feb 23 04:49:14 localhost systemd[1]: libpod-ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204.scope: Deactivated successfully. Feb 23 04:49:14 localhost podman[303179]: 2026-02-23 09:49:14.506713601 +0000 UTC m=+0.160374963 container died ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, vcs-type=git, release=1770267347, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, name=rhceph, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:49:14 localhost podman[303200]: 2026-02-23 09:49:14.61413383 +0000 UTC m=+0.091736593 container remove ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_rubin, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:49:14 localhost systemd[1]: libpod-conmon-ad8c7e6dd1ae3e7bf383da575a6a7ad26565460a029aa291408a1c3c182d8204.scope: Deactivated successfully. 
Feb 23 04:49:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:14 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Feb 23 04:49:14 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... Feb 23 04:49:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Feb 23 04:49:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:49:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:14 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:49:14 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.831 282211 DEBUG oslo_concurrency.lockutils [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.833 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.833 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:49:14 localhost nova_compute[282206]: 2026-02-23 09:49:14.834 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:49:15 localhost nova_compute[282206]: 2026-02-23 09:49:15.185 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:49:15 localhost nova_compute[282206]: 2026-02-23 09:49:15.199 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:49:15 localhost nova_compute[282206]: 2026-02-23 09:49:15.199 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:49:15 localhost nova_compute[282206]: 2026-02-23 09:49:15.200 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:49:15 localhost podman[303267]: 2026-02-23 09:49:15.358218247 +0000 UTC m=+0.078051127 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 23 04:49:15 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:15 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:15 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap 
changed)... Feb 23 04:49:15 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:49:15 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:49:15 localhost podman[303267]: 2026-02-23 09:49:15.374280205 +0000 UTC m=+0.094113065 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.) Feb 23 04:49:15 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:49:15 localhost podman[303275]: Feb 23 04:49:15 localhost podman[303275]: 2026-02-23 09:49:15.424964337 +0000 UTC m=+0.123553991 container create 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:49:15 localhost systemd[1]: var-lib-containers-storage-overlay-8ab0d3838dc42168bb770c38c550942bcf158f406d168831e05d2ea78e499fb0-merged.mount: Deactivated successfully. Feb 23 04:49:15 localhost systemd[1]: Started libpod-conmon-893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742.scope. 
Feb 23 04:49:15 localhost systemd[1]: Started libcrun container. Feb 23 04:49:15 localhost podman[303275]: 2026-02-23 09:49:15.486102178 +0000 UTC m=+0.184691832 container init 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main) Feb 23 04:49:15 localhost podman[303275]: 2026-02-23 09:49:15.394569093 +0000 UTC m=+0.093158797 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:15 localhost trusting_wozniak[303305]: 167 167 Feb 23 04:49:15 localhost podman[303275]: 2026-02-23 09:49:15.494624877 +0000 UTC m=+0.193214531 container start 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, org.opencontainers.image.created=2026-02-09T10:25:24Z, 
io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, distribution-scope=public, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Feb 23 04:49:15 localhost systemd[1]: libpod-893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742.scope: Deactivated successfully. 
Feb 23 04:49:15 localhost podman[303275]: 2026-02-23 09:49:15.494937017 +0000 UTC m=+0.193526721 container attach 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=) Feb 23 04:49:15 localhost podman[303275]: 2026-02-23 09:49:15.497179275 +0000 UTC m=+0.195768949 container died 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main) Feb 23 04:49:15 localhost podman[303310]: 2026-02-23 09:49:15.586794353 +0000 UTC m=+0.081815081 container remove 893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_wozniak, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, release=1770267347) Feb 23 04:49:15 localhost systemd[1]: libpod-conmon-893ab8997ef7a75ef8cb1833e1ae3be2679d26bb82011db736f00d40e7b35742.scope: Deactivated successfully. Feb 23 04:49:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:15 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Feb 23 04:49:15 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... 
Feb 23 04:49:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0) Feb 23 04:49:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:49:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:15 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:15 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:49:15 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:49:15 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:16 localhost nova_compute[282206]: 2026-02-23 09:49:16.074 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:16 localhost nova_compute[282206]: 2026-02-23 09:49:16.108 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:16 localhost nova_compute[282206]: 2026-02-23 09:49:16.108 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:16 localhost nova_compute[282206]: 2026-02-23 09:49:16.108 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:16 localhost nova_compute[282206]: 2026-02-23 09:49:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:16 localhost nova_compute[282206]: 2026-02-23 09:49:16.111 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:16 localhost podman[303386]: Feb 23 04:49:16 localhost systemd[1]: var-lib-containers-storage-overlay-0e289e261848ed4c1c2482eaf8df4286950606cb840cd763a5abd9c3a2de1513-merged.mount: Deactivated successfully. Feb 23 04:49:16 localhost podman[303386]: 2026-02-23 09:49:16.434219514 +0000 UTC m=+0.085872665 container create cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, 
GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:49:16 localhost systemd[1]: Started libpod-conmon-cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759.scope. Feb 23 04:49:16 localhost systemd[1]: Started libcrun container. Feb 23 04:49:16 localhost podman[303386]: 2026-02-23 09:49:16.394225446 +0000 UTC m=+0.045878597 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:16 localhost podman[303386]: 2026-02-23 09:49:16.502807301 +0000 UTC m=+0.154460452 container init cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, 
distribution-scope=public, com.redhat.component=rhceph-container) Feb 23 04:49:16 localhost podman[303386]: 2026-02-23 09:49:16.513576619 +0000 UTC m=+0.165229780 container start cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:49:16 localhost podman[303386]: 2026-02-23 09:49:16.513861688 +0000 UTC m=+0.165514849 container attach cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:49:16 localhost blissful_greider[303401]: 167 167 Feb 23 04:49:16 localhost podman[303386]: 2026-02-23 09:49:16.516939391 +0000 UTC m=+0.168592562 container died cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux , release=1770267347, io.buildah.version=1.42.2, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:49:16 localhost systemd[1]: libpod-cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759.scope: Deactivated successfully. Feb 23 04:49:16 localhost podman[303406]: 2026-02-23 09:49:16.609075506 +0000 UTC m=+0.081844272 container remove cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_greider, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.42.2, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:49:16 localhost systemd[1]: libpod-conmon-cd29203827fcbd15bb328840c656403d3a1d833b613782956c6e12ce401a1759.scope: 
Deactivated successfully. Feb 23 04:49:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:16 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:16 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:16 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)... Feb 23 04:49:16 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:49:16 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:49:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:16 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... Feb 23 04:49:16 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... 
Feb 23 04:49:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 23 04:49:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:16 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:16 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:49:16 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:49:17 localhost nova_compute[282206]: 2026-02-23 09:49:17.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:17 localhost podman[303482]: Feb 23 04:49:17 localhost podman[303482]: 2026-02-23 09:49:17.421679907 +0000 UTC m=+0.077536680 container create b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:49:17 localhost systemd[1]: tmp-crun.I5r2fT.mount: Deactivated successfully. Feb 23 04:49:17 localhost systemd[1]: var-lib-containers-storage-overlay-8179127daad17bf965845a3875a822603381bc2f529d8f3cdc6afa4736d2b6f3-merged.mount: Deactivated successfully. Feb 23 04:49:17 localhost systemd[1]: Started libpod-conmon-b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04.scope. Feb 23 04:49:17 localhost systemd[1]: Started libcrun container. 
Feb 23 04:49:17 localhost podman[303482]: 2026-02-23 09:49:17.481650903 +0000 UTC m=+0.137507686 container init b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux , RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:49:17 localhost podman[303482]: 2026-02-23 09:49:17.391621653 +0000 UTC m=+0.047478466 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:17 localhost podman[303482]: 2026-02-23 09:49:17.491650027 +0000 UTC m=+0.147506810 container start b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:49:17 localhost podman[303482]: 2026-02-23 09:49:17.491923965 +0000 UTC m=+0.147780778 container attach b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, CEPH_POINT_RELEASE=, version=7, ceph=True, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1770267347, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:49:17 localhost thirsty_nightingale[303497]: 167 167 Feb 23 04:49:17 localhost systemd[1]: libpod-b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04.scope: Deactivated successfully. Feb 23 04:49:17 localhost podman[303482]: 2026-02-23 09:49:17.494087101 +0000 UTC m=+0.149943874 container died b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64) Feb 23 
04:49:17 localhost podman[303502]: 2026-02-23 09:49:17.591345341 +0000 UTC m=+0.085761111 container remove b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_nightingale, release=1770267347, ceph=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main) Feb 23 04:49:17 localhost systemd[1]: libpod-conmon-b92e1273ab597d7dfe6cd268790d1721bd07b01929b188365a427cc190026b04.scope: Deactivated successfully. 
Feb 23 04:49:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:17 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:17 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:17 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626463.wtksup (monmap changed)... Feb 23 04:49:17 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626463.wtksup (monmap changed)... Feb 23 04:49:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 23 04:49:17 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 23 04:49:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch Feb 23 04:49:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:17 localhost 
ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:17 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:49:17 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:49:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:17 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... Feb 23 04:49:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:17 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:49:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:17 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:17 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:18 localhost nova_compute[282206]: 2026-02-23 09:49:18.054 282211 DEBUG 
oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:18 localhost nova_compute[282206]: 2026-02-23 09:49:18.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:18 localhost nova_compute[282206]: 2026-02-23 09:49:18.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:49:18 localhost nova_compute[282206]: 2026-02-23 09:49:18.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:49:18 localhost podman[303571]: Feb 23 04:49:18 localhost podman[303571]: 2026-02-23 09:49:18.263379415 +0000 UTC m=+0.074288423 container create e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, release=1770267347, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, 
architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Feb 23 04:49:18 localhost systemd[1]: Started libpod-conmon-e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8.scope. Feb 23 04:49:18 localhost systemd[1]: Started libcrun container. Feb 23 04:49:18 localhost podman[303571]: 2026-02-23 09:49:18.319813772 +0000 UTC m=+0.130722810 container init e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, release=1770267347) Feb 23 04:49:18 localhost suspicious_joliot[303586]: 167 167 Feb 23 04:49:18 localhost podman[303571]: 2026-02-23 09:49:18.330950301 +0000 UTC m=+0.141859329 container start e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph) Feb 23 04:49:18 localhost podman[303571]: 2026-02-23 09:49:18.331555239 +0000 UTC m=+0.142464257 container attach e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, build-date=2026-02-09T10:25:24Z, release=1770267347, version=7, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:49:18 localhost podman[303571]: 2026-02-23 09:49:18.233091432 +0000 UTC m=+0.044000470 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:18 localhost systemd[1]: libpod-e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8.scope: Deactivated successfully. 
Feb 23 04:49:18 localhost podman[303571]: 2026-02-23 09:49:18.333915761 +0000 UTC m=+0.144824769 container died e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, architecture=x86_64, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, distribution-scope=public, name=rhceph, version=7, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Feb 23 04:49:18 localhost podman[303591]: 2026-02-23 09:49:18.426288532 +0000 UTC m=+0.079229062 container remove e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_joliot, GIT_BRANCH=main, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Feb 23 04:49:18 localhost systemd[1]: var-lib-containers-storage-overlay-89335beadee5d2918112b22abd8fd4f968d60c2807236c28273a7ecee2122fa4-merged.mount: Deactivated successfully. Feb 23 04:49:18 localhost systemd[1]: libpod-conmon-e95808fe281fa179cd8bf04e62a45de2e81ebda56cd753e6b5a7b3aa87a399f8.scope: Deactivated successfully. 
Feb 23 04:49:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:18 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626465 (monmap changed)... Feb 23 04:49:18 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626465 (monmap changed)... Feb 23 04:49:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:49:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:18 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:18 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain Feb 23 04:49:18 localhost 
ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:49:18 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)...
Feb 23 04:49:18 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain
Feb 23 04:49:18 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:18 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:18 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.063 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.064 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.064 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Feb 23 04:49:19 localhost sshd[303608]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 04:49:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:49:19 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:49:19 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:19 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 23 04:49:19 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 23 04:49:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 23 04:49:19 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 04:49:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:19 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:19 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:49:19 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.807 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.833 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.834 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.834 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:49:19 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)...
Feb 23 04:49:19 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain
Feb 23 04:49:19 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:19 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:19 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 23 04:49:19 localhost nova_compute[282206]: 2026-02-23 09:49:19.859 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:49:19 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.210 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.210 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.211 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.211 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.212 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:49:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:49:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:49:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:20 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Feb 23 04:49:20 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Feb 23 04:49:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 23 04:49:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 04:49:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:20 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:49:20 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:49:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 04:49:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4209324860' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.660 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.718 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.718 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.896 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.898 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11736MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.898 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:49:20 localhost nova_compute[282206]: 2026-02-23 09:49:20.898 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:49:20 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)...
Feb 23 04:49:20 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain
Feb 23 04:49:20 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:20 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:20 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.040 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.041 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.041 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.095 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.112 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.114 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.114 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.114 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.137 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.138 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.159 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.160 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.183 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.210 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.247 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Feb 23 04:49:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:49:21 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:49:21 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:21 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:49:21 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:49:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 23 04:49:21 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:49:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:21 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:21 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:49:21 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:49:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 04:49:21 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1220083815' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.699 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.708 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.730 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.733 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.733 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.734 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.734 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb 23 04:49:21 localhost nova_compute[282206]: 2026-02-23 09:49:21.753 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb 23 04:49:21 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:21 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)...
Feb 23 04:49:21 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain
Feb 23 04:49:21 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:21 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:21 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 23 04:49:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:49:22 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:49:22 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:22 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:49:22 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:49:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 23 04:49:22 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:49:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 23 04:49:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch
Feb 23 04:49:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:22 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 04:49:22 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain
Feb 23 04:49:22 localhost nova_compute[282206]: 2026-02-23 09:49:22.753 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 04:49:22 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)...
Feb 23 04:49:22 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain
Feb 23 04:49:22 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:22 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:22 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:23 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 04:49:23 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626466 (monmap changed)...
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:23 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 04:49:23 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain
Feb 23 04:49:23 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 23 04:49:23 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.54153 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005626466.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup'
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 23 04:49:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 23 04:49:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 23 04:49:23 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:49:23 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005626466 on np0005626466.localdomain
Feb 23 04:49:23 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)...
Feb 23 04:49:23 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:49:23 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:23 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:23 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:23 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:23 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:49:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:24 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)... Feb 23 04:49:24 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... 
Feb 23 04:49:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Feb 23 04:49:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:49:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:24 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:24 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:49:24 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:49:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:49:24 localhost podman[303654]: 2026-02-23 09:49:24.919291875 +0000 UTC m=+0.090379782 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0) Feb 23 04:49:24 localhost podman[303654]: 2026-02-23 09:49:24.957317922 +0000 UTC m=+0.128405829 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:49:24 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:49:24 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)... 
Feb 23 04:49:24 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:49:24 localhost ceph-mon[294160]: Deploying daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:49:24 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:24 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:24 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:49:25 localhost podman[303655]: 2026-02-23 09:49:24.962097848 +0000 UTC m=+0.131457272 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:49:25 localhost podman[303655]: 2026-02-23 09:49:25.045567649 +0000 UTC m=+0.214927033 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:49:25 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:49:25 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:25 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)... Feb 23 04:49:25 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:26 localhost nova_compute[282206]: 2026-02-23 09:49:26.139 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:26 localhost nova_compute[282206]: 2026-02-23 09:49:26.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:26 localhost nova_compute[282206]: 2026-02-23 09:49:26.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:26 localhost nova_compute[282206]: 2026-02-23 09:49:26.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:26 localhost nova_compute[282206]: 2026-02-23 09:49:26.175 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:26 localhost nova_compute[282206]: 2026-02-23 09:49:26.175 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Feb 23 04:49:26 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).monmap v14 adding/updating np0005626466 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:26 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (2) No such file or directory Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": 
"np0005626463"} v 0) Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626463"} : dispatch Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626465"} v 0) Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626465"} : dispatch Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:26 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:49:26 localhost ceph-mon[294160]: paxos.0).electionLogic(58) init, last seen epoch 58 Feb 23 04:49:26 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:26 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:49:27 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:27 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:27 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session 
(expect reconnect) Feb 23 04:49:27 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:27 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:27 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:49:28 localhost podman[303703]: 2026-02-23 09:49:28.910462166 +0000 UTC m=+0.082758130 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute) Feb 23 04:49:28 localhost podman[303703]: 2026-02-23 09:49:28.922342138 +0000 UTC m=+0.094638102 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0) Feb 23 04:49:28 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:28 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:28 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:28 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:49:29 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:29 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:29 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:29 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:30 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:30 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:30 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:30 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:31 localhost nova_compute[282206]: 2026-02-23 09:49:31.176 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:31 localhost nova_compute[282206]: 2026-02-23 09:49:31.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:31 localhost nova_compute[282206]: 2026-02-23 09:49:31.178 282211 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:31 localhost nova_compute[282206]: 2026-02-23 09:49:31.179 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:31 localhost nova_compute[282206]: 2026-02-23 09:49:31.213 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:31 localhost nova_compute[282206]: 2026-02-23 09:49:31.213 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:31 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:31 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:31 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:31 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1) Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : monmap epoch 15 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log 
[DBG] : fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : last_changed 2026-02-23T09:49:26.924061+0000 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : created 2026-02-23T07:36:01.997603+0000 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005626466 Feb 23 04:49:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e84: 6 total, 6 up, 6 in Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e32: np0005626463.wtksup(active, since 84s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005626463,np0005626465 (MON_DOWN) Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum np0005626463,np0005626465 Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005626463,np0005626465 Feb 
23 04:49:31 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : mon.np0005626466 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:32 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Feb 23 04:49:32 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:32 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:49:32 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:49:32 localhost ceph-mon[294160]: 
mon.np0005626465 calling monitor election Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463 is new leader, mons np0005626463,np0005626465 in quorum (ranks 0,1) Feb 23 04:49:32 localhost ceph-mon[294160]: Health check failed: 1/3 mons down, quorum np0005626463,np0005626465 (MON_DOWN) Feb 23 04:49:32 localhost ceph-mon[294160]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005626463,np0005626465 Feb 23 04:49:32 localhost ceph-mon[294160]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005626463,np0005626465 Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626466 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Feb 23 04:49:32 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:32 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:32 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:49:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:49:32 localhost systemd[1]: tmp-crun.nogxYS.mount: Deactivated successfully. 
Feb 23 04:49:32 localhost podman[303722]: 2026-02-23 09:49:32.907057481 +0000 UTC m=+0.078981044 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:49:32 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:32 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:32 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:32 localhost podman[303722]: 2026-02-23 09:49:32.940347384 +0000 UTC m=+0.112271017 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:32 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)... Feb 23 04:49:32 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005626466.vaywlp (monmap changed)... 
Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:32 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain Feb 23 04:49:32 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005626466.vaywlp on np0005626466.localdomain Feb 23 04:49:32 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:49:33 localhost ceph-mon[294160]: Reconfiguring osd.4 (monmap changed)... 
Feb 23 04:49:33 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:49:33 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:33 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:33 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626466.vaywlp", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:33 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005626466.nisqfq (monmap changed)... Feb 23 04:49:33 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005626466.nisqfq (monmap changed)... 
Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626466.nisqfq", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr services"} v 0) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mgr services"} : dispatch Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:33 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:49:33 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:49:33 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:33 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": 
"np0005626466"} v 0) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:33 localhost ceph-mgr[288036]: mgr finish mon failed to return metadata for mon.np0005626466: (22) Invalid argument Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 calling monitor election Feb 23 04:49:33 localhost ceph-mon[294160]: paxos.0).electionLogic(60) init, last seen epoch 60 Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : mon.np0005626463 is new leader, mons np0005626463,np0005626465,np0005626466 in quorum (ranks 0,1,2) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : monmap epoch 15 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : fsid f1fea371-cb69-578d-a3d0-b5c472a84b46 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : last_changed 2026-02-23T09:49:26.924061+0000 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : created 2026-02-23T07:36:01.997603+0000 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : election_strategy: 1 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005626463 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005626465 Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] 
mon.np0005626466 Feb 23 04:49:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005626463.qcthuc=up:active} 2 up:standby Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e84: 6 total, 6 up, 6 in Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e32: np0005626463.wtksup(active, since 86s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005626463,np0005626465) Feb 23 04:49:33 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Cluster is now healthy Feb 23 04:49:34 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:49:34 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626466.nisqfq (monmap changed)... 
Feb 23 04:49:34 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626466.nisqfq on np0005626466.localdomain Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626466 calling monitor election Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626463 calling monitor election Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626465 calling monitor election Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626463 is new leader, mons np0005626463,np0005626465,np0005626466 in quorum (ranks 0,1,2) Feb 23 04:49:34 localhost ceph-mon[294160]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005626463,np0005626465) Feb 23 04:49:34 localhost ceph-mon[294160]: Cluster is now healthy Feb 23 04:49:34 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 04:49:34 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:34 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:34 localhost ceph-mgr[288036]: mgr.server handle_open ignoring open from mon.np0005626466 172.18.0.108:0/53979952; not ready for session (expect reconnect) Feb 23 04:49:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005626466"} v 0) Feb 23 04:49:34 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "mon metadata", "id": "np0005626466"} : dispatch Feb 23 04:49:35 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:35 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:35.924+0000 7f44258e4640 -1 mgr.server handle_report got status from non-daemon mon.np0005626466 Feb 23 04:49:35 localhost ceph-mgr[288036]: mgr.server 
handle_report got status from non-daemon mon.np0005626466 Feb 23 04:49:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:36 localhost nova_compute[282206]: 2026-02-23 09:49:36.215 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:36 localhost nova_compute[282206]: 2026-02-23 09:49:36.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:36 localhost nova_compute[282206]: 2026-02-23 09:49:36.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:36 localhost nova_compute[282206]: 2026-02-23 09:49:36.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:36 localhost nova_compute[282206]: 2026-02-23 09:49:36.244 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:36 localhost nova_compute[282206]: 2026-02-23 09:49:36.245 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:36 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 
handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:36 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:36 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Feb 23 04:49:36 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:49:36 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:36 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:36 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:36 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:36 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:36 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:36 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:36 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 
23 04:49:36 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:49:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 
handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mgr[288036]: [progress INFO root] update: starting ev 26d9902b-d853-4da6-8c6d-f48cb0605daa (Updating node-proxy deployment (+3 -> 3)) Feb 23 04:49:37 localhost ceph-mgr[288036]: [progress INFO root] complete: finished ev 26d9902b-d853-4da6-8c6d-f48cb0605daa (Updating node-proxy deployment (+3 -> 3)) Feb 23 04:49:37 localhost ceph-mgr[288036]: [progress INFO root] Completed event 
26d9902b-d853-4da6-8c6d-f48cb0605daa (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Feb 23 04:49:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Feb 23 04:49:37 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Feb 23 04:49:37 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:37 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:37 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:37 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:37 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap 
v48: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections.. Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [] Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections.. Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] scanning for idle connections.. Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Feb 23 04:49:38 localhost ceph-mgr[288036]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Feb 23 04:49:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005626463 (monmap changed)... Feb 23 04:49:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:49:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:49:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:38 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:49:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:49:38 localhost podman[304197]: Feb 23 04:49:38 localhost podman[304197]: 2026-02-23 09:49:38.674968457 +0000 UTC m=+0.075425287 container create ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, distribution-scope=public, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z) Feb 23 04:49:38 localhost systemd[1]: Started libpod-conmon-ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a.scope. Feb 23 04:49:38 localhost systemd[1]: Started libcrun container. Feb 23 04:49:38 localhost podman[304197]: 2026-02-23 09:49:38.644225651 +0000 UTC m=+0.044682511 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:38 localhost podman[304197]: 2026-02-23 09:49:38.747947278 +0000 UTC m=+0.148404108 container init ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, 
maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:49:38 localhost podman[304197]: 2026-02-23 09:49:38.758651584 +0000 UTC m=+0.159108414 container start ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public) Feb 23 04:49:38 localhost podman[304197]: 2026-02-23 09:49:38.758910161 +0000 UTC m=+0.159367001 container attach ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=crazy_turing, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, release=1770267347, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:49:38 localhost crazy_turing[304212]: 167 167 Feb 23 04:49:38 localhost systemd[1]: libpod-ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a.scope: Deactivated successfully. 
Feb 23 04:49:38 localhost podman[304197]: 2026-02-23 09:49:38.761695286 +0000 UTC m=+0.162152146 container died ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.openshift.expose-services=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , release=1770267347, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True) Feb 23 04:49:38 localhost podman[304217]: 2026-02-23 09:49:38.872778927 +0000 UTC m=+0.092074783 container remove ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_turing, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, io.buildah.version=1.42.2, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main) Feb 23 04:49:38 localhost systemd[1]: libpod-conmon-ae6fc89d35e4f5acef62cfadee5d0277c8724e4c84153035069e16b0f1182f1a.scope: Deactivated successfully. Feb 23 04:49:38 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:38 localhost ceph-mon[294160]: Reconfiguring crash.np0005626463 (monmap changed)... 
Feb 23 04:49:38 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626463", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:38 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626463 on np0005626463.localdomain Feb 23 04:49:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Feb 23 04:49:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... 
Feb 23 04:49:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0) Feb 23 04:49:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:49:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:38 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:38 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:49:38 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:49:39 localhost ceph-mgr[288036]: log_channel(audit) log [DBG] : from='client.54166 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch Feb 23 04:49:39 localhost ceph-mgr[288036]: [cephadm INFO root] Reconfig service osd.default_drive_group Feb 23 04:49:39 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:39 localhost 
ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:39 localhost ceph-mgr[288036]: [progress INFO root] Writing back 50 completed events Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : 
from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:39 localhost podman[242954]: time="2026-02-23T09:49:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:39 localhost podman[242954]: @ - - [23/Feb/2026:09:49:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:39 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost podman[242954]: @ - - [23/Feb/2026:09:49:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18278 "" "Go-http-client/1.1" Feb 23 04:49:39 localhost podman[304288]: Feb 23 04:49:39 localhost podman[304288]: 2026-02-23 09:49:39.61347653 +0000 UTC m=+0.077772469 container create a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, distribution-scope=public, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=1770267347, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, version=7, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:49:39 localhost systemd[1]: Started 
libpod-conmon-a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8.scope. Feb 23 04:49:39 localhost systemd[1]: Started libcrun container. Feb 23 04:49:39 localhost podman[304288]: 2026-02-23 09:49:39.58097838 +0000 UTC m=+0.045274369 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:39 localhost systemd[1]: var-lib-containers-storage-overlay-a06b2b25667b3e704ebf9634303fe6bc5679ded988ffc68f93a235fc21261233-merged.mount: Deactivated successfully. Feb 23 04:49:39 localhost podman[304288]: 2026-02-23 09:49:39.682435199 +0000 UTC m=+0.146731138 container init a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, architecture=x86_64) Feb 23 04:49:39 localhost podman[304288]: 2026-02-23 09:49:39.691154544 +0000 UTC m=+0.155450483 container start 
a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, io.openshift.expose-services=, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , name=rhceph, release=1770267347, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:49:39 localhost recursing_swartz[304303]: 167 167 Feb 23 04:49:39 localhost podman[304288]: 2026-02-23 09:49:39.691587997 +0000 UTC m=+0.155884016 container attach a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, vcs-type=git) Feb 23 04:49:39 localhost systemd[1]: libpod-a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8.scope: Deactivated successfully. Feb 23 04:49:39 localhost podman[304288]: 2026-02-23 09:49:39.695672741 +0000 UTC m=+0.159968720 container died a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the 
latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, version=7, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:49:39 localhost systemd[1]: var-lib-containers-storage-overlay-f30df2981ef37d9c237946931441128736a9ce4c64e09201c00980f67e92e2c1-merged.mount: Deactivated successfully. Feb 23 04:49:39 localhost podman[304308]: 2026-02-23 09:49:39.792491608 +0000 UTC m=+0.088221266 container remove a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_swartz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , ceph=True, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph) Feb 23 04:49:39 localhost systemd[1]: libpod-conmon-a772a85bb2b71d6bffa76dd8deb6aada4bbe0adadb6fcc921ff7f88ea7a830f8.scope: 
Deactivated successfully. Feb 23 04:49:39 localhost ceph-mgr[288036]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: Reconfiguring osd.2 (monmap changed)... Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Feb 23 04:49:39 localhost ceph-mon[294160]: Reconfiguring daemon osd.2 on np0005626463.localdomain Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' 
Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:39 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0. 
Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.034991) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180035055, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2166, "num_deletes": 252, "total_data_size": 3505561, "memory_usage": 3558216, "flush_reason": "Manual Compaction"} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180046154, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2398594, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17913, "largest_seqno": 20074, "table_properties": {"data_size": 2389183, "index_size": 5596, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 24651, "raw_average_key_size": 22, "raw_value_size": 2368421, "raw_average_value_size": 2158, "num_data_blocks": 248, "num_entries": 1097, "num_filter_entries": 1097, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840126, "oldest_key_time": 1771840126, "file_creation_time": 1771840180, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 11207 microseconds, and 5889 cpu microseconds. Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.046203) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2398594 bytes OK Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.046226) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.048570) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.048607) EVENT_LOG_v1 {"time_micros": 1771840180048599, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.048628) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 3495127, prev total WAL file 
size 3511606, number of live WAL files 2. Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.049663) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end) Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2342KB)], [30(17MB)] Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180049723, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 20316694, "oldest_snapshot_seqno": -1} Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] 
[default] [JOB 16] Generated table #33: 11639 keys, 16330924 bytes, temperature: kUnknown Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180122652, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 16330924, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16262696, "index_size": 38047, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 312649, "raw_average_key_size": 26, "raw_value_size": 16062192, "raw_average_value_size": 1380, "num_data_blocks": 1452, "num_entries": 11639, "num_filter_entries": 11639, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840180, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.122901) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 16330924 bytes Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.124449) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 278.4 rd, 223.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 17.1 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(15.3) write-amplify(6.8) OK, records in: 12175, records dropped: 536 output_compression: NoCompression Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.124467) EVENT_LOG_v1 {"time_micros": 1771840180124458, "job": 16, "event": "compaction_finished", "compaction_time_micros": 72982, "compaction_time_cpu_micros": 19274, "output_level": 6, "num_output_files": 1, "total_output_size": 16330924, "num_input_records": 12175, "num_output_records": 11639, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180124772, "job": 16, "event": "table_file_deletion", "file_number": 32} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840180126550, 
"job": 16, "event": "table_file_deletion", "file_number": 30} Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.049568) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126787) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.126791) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:49:40 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:49:40.127023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:40 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)... Feb 23 04:49:40 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)... 
Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "config generate-minimal-conf"} : dispatch Feb 23 04:49:40 localhost ceph-mgr[288036]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:49:40 localhost ceph-mgr[288036]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e33: np0005626463.wtksup(active, since 92s), standbys: np0005626466.nisqfq, np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:40 localhost podman[304383]: Feb 23 04:49:40 localhost podman[304383]: 2026-02-23 09:49:40.793602527 +0000 UTC m=+0.074470347 container create dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, 
vcs-type=git, architecture=x86_64, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph) Feb 23 04:49:40 localhost systemd[1]: Started libpod-conmon-dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4.scope. Feb 23 04:49:40 localhost systemd[1]: Started libcrun container. Feb 23 04:49:40 localhost podman[304383]: 2026-02-23 09:49:40.856257334 +0000 UTC m=+0.137125144 container init dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, vcs-type=git, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vendor=Red Hat, Inc., io.openshift.expose-services=, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7) Feb 23 04:49:40 localhost podman[304383]: 2026-02-23 09:49:40.762081557 +0000 UTC m=+0.042949377 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:40 localhost vigorous_lederberg[304398]: 167 167 Feb 23 04:49:40 localhost systemd[1]: libpod-dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4.scope: Deactivated successfully. Feb 23 04:49:40 localhost podman[304383]: 2026-02-23 09:49:40.869500036 +0000 UTC m=+0.150367846 container start dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, io.buildah.version=1.42.2, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:49:40 
localhost podman[304383]: 2026-02-23 09:49:40.86994933 +0000 UTC m=+0.150817180 container attach dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, release=1770267347, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 04:49:40 localhost podman[304383]: 2026-02-23 09:49:40.872390324 +0000 UTC m=+0.153258134 container died dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, 
GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1770267347, maintainer=Guillaume Abrioux , distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0) Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1175127914' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e84 do_prune osdmap full prune enabled Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Activating manager daemon np0005626466.nisqfq Feb 23 04:49:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 e85: 6 total, 6 up, 6 in Feb 23 04:49:40 localhost ceph-mgr[288036]: mgr handle_mgr_map I was active but no longer am Feb 23 04:49:40 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:40.952+0000 7f4481af5640 -1 mgr handle_mgr_map I was active but no longer am Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e85: 6 total, 6 up, 6 in Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? 
172.18.0.200:0/1175127914' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:49:40 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e34: np0005626466.nisqfq(active, starting, since 0.0428083s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:40 localhost podman[304403]: 2026-02-23 09:49:40.968724857 +0000 UTC m=+0.091001241 container remove dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigorous_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, version=7, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.) Feb 23 04:49:40 localhost systemd[1]: libpod-conmon-dbfdcfcd6065581a94de3f3d6c99eaf1e43230c85521f876d5c024b3b621e3e4.scope: Deactivated successfully. Feb 23 04:49:40 localhost systemd-logind[759]: Session 69 logged out. Waiting for processes to exit. 
Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Manager daemon np0005626466.nisqfq is now available Feb 23 04:49:41 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: ignoring --setuser ceph since I am not root Feb 23 04:49:41 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: ignoring --setgroup ceph since I am not root Feb 23 04:49:41 localhost ceph-mgr[288036]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2 Feb 23 04:49:41 localhost ceph-mgr[288036]: pidfile_write: ignore empty --pid-file Feb 23 04:49:41 localhost ceph-mon[294160]: Reconfig service osd.default_drive_group Feb 23 04:49:41 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:41 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:41 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:41 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' Feb 23 04:49:41 localhost ceph-mon[294160]: from='mgr.26614 172.18.0.106:0/483999791' entity='mgr.np0005626463.wtksup' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:49:41 localhost ceph-mon[294160]: from='client.? 172.18.0.200:0/1175127914' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:49:41 localhost ceph-mon[294160]: Activating manager daemon np0005626466.nisqfq Feb 23 04:49:41 localhost ceph-mon[294160]: from='client.? 
172.18.0.200:0/1175127914' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:49:41 localhost ceph-mon[294160]: Manager daemon np0005626466.nisqfq is now available Feb 23 04:49:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} v 0) Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished Feb 23 04:49:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} v 0) Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Loading python module 'alerts' Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished Feb 23 04:49:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:41 localhost systemd[1]: session-69.scope: Deactivated successfully. Feb 23 04:49:41 localhost systemd[1]: session-69.scope: Consumed 23.429s CPU time. Feb 23 04:49:41 localhost systemd-logind[759]: Removed session 69. 
Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Loading python module 'balancer' Feb 23 04:49:41 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:41.151+0000 7f6fbef71140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Feb 23 04:49:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} v 0) Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} : dispatch Feb 23 04:49:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} v 0) Feb 23 04:49:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} : dispatch Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Loading python module 'cephadm' Feb 23 04:49:41 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:41.226+0000 7f6fbef71140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Feb 23 04:49:41 localhost nova_compute[282206]: 2026-02-23 09:49:41.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:41 localhost nova_compute[282206]: 2026-02-23 09:49:41.248 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:41 localhost nova_compute[282206]: 2026-02-23 09:49:41.248 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:41 localhost nova_compute[282206]: 2026-02-23 09:49:41.248 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:41 localhost nova_compute[282206]: 2026-02-23 09:49:41.285 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:41 localhost nova_compute[282206]: 2026-02-23 09:49:41.286 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:41 localhost sshd[304449]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:49:41 localhost systemd-logind[759]: New session 70 of user ceph-admin. Feb 23 04:49:41 localhost systemd[1]: Started Session 70 of User ceph-admin. Feb 23 04:49:41 localhost systemd[1]: var-lib-containers-storage-overlay-c4033d6ce700523fb971bc451b1396ff9a33dd68e76f08a48a0dd0cb5f2d245c-merged.mount: Deactivated successfully. 
Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Loading python module 'crash' Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Module crash has missing NOTIFY_TYPES member Feb 23 04:49:41 localhost ceph-mgr[288036]: mgr[py] Loading python module 'dashboard' Feb 23 04:49:41 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:41.945+0000 7f6fbef71140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e35: np0005626466.nisqfq(active, since 1.07954s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:42 localhost ceph-mon[294160]: removing stray HostCache host record np0005626461.localdomain.devices.0 Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': finished Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005626461.localdomain.devices.0"}]': 
finished Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/mirror_snapshot_schedule"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} : dispatch Feb 23 04:49:42 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626466.nisqfq/trash_purge_schedule"} : dispatch Feb 23 04:49:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:49:42 localhost podman[304537]: 2026-02-23 09:49:42.303917813 +0000 UTC m=+0.083886544 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:49:42 localhost podman[304537]: 2026-02-23 09:49:42.339161816 +0000 UTC m=+0.119130577 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:49:42 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Loading python module 'devicehealth' Feb 23 04:49:42 localhost podman[304588]: 2026-02-23 09:49:42.528652843 +0000 UTC m=+0.091280029 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=) Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.538+0000 7f6fbef71140 -1 
mgr[py] Module devicehealth has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Loading python module 'diskprediction_local' Feb 23 04:49:42 localhost podman[304588]: 2026-02-23 09:49:42.662416674 +0000 UTC m=+0.225043860 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, RELEASE=main) Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. 
This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: from numpy import show_config as show_numpy_config Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.673+0000 7f6fbef71140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Loading python module 'influx' Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Module influx has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Loading python module 'insights' Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.731+0000 7f6fbef71140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Loading python module 'iostat' Feb 23 04:49:42 localhost systemd[1]: tmp-crun.b51AMX.mount: Deactivated successfully. 
Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Module iostat has missing NOTIFY_TYPES member Feb 23 04:49:42 localhost ceph-mgr[288036]: mgr[py] Loading python module 'k8sevents' Feb 23 04:49:42 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:42.845+0000 7f6fbef71140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e36: np0005626466.nisqfq(active, since 2s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'localpool' Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'mds_autoscaler' Feb 23 04:49:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:43 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:43 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:43 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'mirroring' Feb 23 04:49:43 localhost ceph-mon[294160]: 
log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:43 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:43 localhost openstack_network_exporter[245358]: ERROR 09:49:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:49:43 localhost openstack_network_exporter[245358]: Feb 23 04:49:43 localhost openstack_network_exporter[245358]: ERROR 09:49:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:49:43 localhost openstack_network_exporter[245358]: Feb 23 04:49:43 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'nfs' Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'orchestrator' Feb 23 04:49:43 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.575+0000 7f6fbef71140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'osd_perf_query' Feb 23 04:49:43 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.718+0000 7f6fbef71140 -1 mgr[py] Module orchestrator has missing 
NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'osd_support' Feb 23 04:49:43 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.781+0000 7f6fbef71140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'pg_autoscaler' Feb 23 04:49:43 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.837+0000 7f6fbef71140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'progress' Feb 23 04:49:43 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.903+0000 7f6fbef71140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Module progress has missing NOTIFY_TYPES member Feb 23 04:49:43 localhost ceph-mgr[288036]: mgr[py] Loading python module 'prometheus' Feb 23 04:49:43 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:43.963+0000 7f6fbef71140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Bus STARTING Feb 23 04:49:44 localhost ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Serving on http://172.18.0.108:8765 Feb 23 04:49:44 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 
04:49:44 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Serving on https://172.18.0.108:7150 Feb 23 04:49:44 localhost ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Bus STARTED Feb 23 04:49:44 localhost ceph-mon[294160]: [23/Feb/2026:09:49:43] ENGINE Client ('172.18.0.108', 60904) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Loading python module 'rbd_support' Feb 23 04:49:44 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:44.258+0000 7f6fbef71140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Loading python module 'restful' Feb 23 04:49:44 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:44.338+0000 7f6fbef71140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Loading python module 'rgw' Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Module rgw has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mgr[288036]: mgr[py] Loading python module 'rook' Feb 23 04:49:44 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:44.655+0000 7f6fbef71140 -1 mgr[py] 
Module rgw has missing NOTIFY_TYPES member Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: 
log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 23 04:49:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:49:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module rook has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'selftest' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.055+0000 7f6fbef71140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'snap_schedule' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.116+0000 7f6fbef71140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e37: np0005626466.nisqfq(active, since 4s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'stats' Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'status' Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module status has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 
'telegraf' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.305+0000 7f6fbef71140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'telemetry' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.362+0000 7f6fbef71140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'test_orchestrator' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.491+0000 7f6fbef71140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost podman[304953]: 2026-02-23 09:49:45.504906184 +0000 UTC m=+0.078760417 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Feb 23 04:49:45 localhost podman[304953]: 2026-02-23 09:49:45.522382216 +0000 UTC m=+0.096236489 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal) Feb 23 04:49:45 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'volumes' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.638+0000 7f6fbef71140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 
836.6M Feb 23 04:49:45 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:49:45 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.0", "name": 
"osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:49:45 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:49:45 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:49:45 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:45 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:45 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:49:45 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Loading python module 'zabbix' Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.837+0000 7f6fbef71140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mgr[288036]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-mgr-np0005626463-wtksup[288032]: 2026-02-23T09:49:45.900+0000 7f6fbef71140 -1 mgr[py] Module zabbix has missing 
NOTIFY_TYPES member Feb 23 04:49:45 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : Standby manager daemon np0005626463.wtksup started Feb 23 04:49:45 localhost ceph-mgr[288036]: ms_deliver_dispatch: unhandled message 0x559b75999600 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Feb 23 04:49:45 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:6810/1471406 Feb 23 04:49:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:46 localhost nova_compute[282206]: 2026-02-23 09:49:46.291 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:46 localhost nova_compute[282206]: 2026-02-23 09:49:46.296 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:46 localhost nova_compute[282206]: 2026-02-23 09:49:46.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5012 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:46 localhost nova_compute[282206]: 2026-02-23 09:49:46.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:46 localhost nova_compute[282206]: 2026-02-23 09:49:46.324 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:46 localhost nova_compute[282206]: 2026-02-23 09:49:46.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:46 localhost ceph-mon[294160]: Updating 
np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:46 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:49:46 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:49:46 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:49:46 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:49:46 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.108:6810/1471406 Feb 23 04:49:46 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e38: np0005626466.nisqfq(active, since 6s), standbys: np0005626465.hlpkwo, np0005626461.lrfquh, np0005626463.wtksup Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) 
Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 23 04:49:47 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 23 04:49:47 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:49:47 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:49:47 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:49:47 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost 
ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:47 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:49:48.552 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:49:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:49:48.552 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:49:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:49:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:49:48 localhost podman[305561]: Feb 23 04:49:48 localhost podman[305561]: 2026-02-23 09:49:48.701280336 +0000 UTC m=+0.083153462 container create 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1770267347, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:49:48 localhost systemd[1]: Started libpod-conmon-1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c.scope. Feb 23 04:49:48 localhost systemd[1]: Started libcrun container. 
Feb 23 04:49:48 localhost podman[305561]: 2026-02-23 09:49:48.66758125 +0000 UTC m=+0.049454466 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:48 localhost podman[305561]: 2026-02-23 09:49:48.779229698 +0000 UTC m=+0.161102864 container init 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Feb 23 04:49:48 localhost podman[305561]: 2026-02-23 09:49:48.792527603 +0000 UTC m=+0.174400769 container start 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, ceph=True, release=1770267347, 
GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:49:48 localhost podman[305561]: 2026-02-23 09:49:48.79309854 +0000 UTC m=+0.174971706 container attach 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, 
build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=) Feb 23 04:49:48 localhost silly_babbage[305576]: 167 167 Feb 23 04:49:48 localhost systemd[1]: libpod-1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c.scope: Deactivated successfully. Feb 23 04:49:48 localhost podman[305561]: 2026-02-23 09:49:48.797325819 +0000 UTC m=+0.179199055 container died 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, version=7, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , architecture=x86_64, distribution-scope=public, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:49:48 localhost 
podman[305581]: 2026-02-23 09:49:48.896565619 +0000 UTC m=+0.090667821 container remove 1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_babbage, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, RELEASE=main, architecture=x86_64, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc.) Feb 23 04:49:48 localhost systemd[1]: libpod-conmon-1b80f8230a511807e7c8a32c0a9a6dae304e82209e65f6aed84aec8df6ff376c.scope: Deactivated successfully. Feb 23 04:49:48 localhost ceph-mon[294160]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Feb 23 04:49:48 localhost ceph-mon[294160]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Feb 23 04:49:48 localhost ceph-mon[294160]: Reconfiguring osd.5 (monmap changed)... 
Feb 23 04:49:48 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Feb 23 04:49:48 localhost ceph-mon[294160]: Reconfiguring daemon osd.5 on np0005626463.localdomain Feb 23 04:49:48 localhost sshd[305600]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:49:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 23 04:49:49 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' 
cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:49 localhost systemd[1]: var-lib-containers-storage-overlay-3e5dcd73db399b7bd165e48ba18e6960b89c64b2e1269e4fe0b6a9d247aa6147-merged.mount: Deactivated successfully. Feb 23 04:49:49 localhost podman[305660]: Feb 23 04:49:49 localhost podman[305660]: 2026-02-23 09:49:49.868051526 +0000 UTC m=+0.082874353 container create 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, release=1770267347, name=rhceph, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., RELEASE=main) Feb 23 04:49:49 localhost systemd[1]: Started libpod-conmon-70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708.scope. Feb 23 04:49:49 localhost systemd[1]: Started libcrun container. 
Feb 23 04:49:49 localhost podman[305660]: 2026-02-23 09:49:49.83436269 +0000 UTC m=+0.049185557 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:49 localhost podman[305660]: 2026-02-23 09:49:49.940916043 +0000 UTC m=+0.155738880 container init 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2) Feb 23 04:49:49 localhost podman[305660]: 2026-02-23 09:49:49.951537807 +0000 UTC m=+0.166360644 container start 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, name=rhceph, ceph=True, maintainer=Guillaume Abrioux , 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:49:49 localhost podman[305660]: 2026-02-23 09:49:49.951835336 +0000 UTC m=+0.166658163 container attach 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, architecture=x86_64, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 
in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:49:49 localhost sweet_lewin[305675]: 167 167 Feb 23 04:49:49 localhost systemd[1]: libpod-70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708.scope: Deactivated successfully. Feb 23 04:49:49 localhost podman[305660]: 2026-02-23 09:49:49.956487348 +0000 UTC m=+0.171310205 container died 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1770267347, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14) Feb 23 04:49:50 localhost podman[305680]: 2026-02-23 09:49:50.096006804 
+0000 UTC m=+0.089111013 container remove 70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_lewin, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, version=7, architecture=x86_64) Feb 23 04:49:50 localhost systemd[1]: libpod-conmon-70be4ef4e3ad862e9be560db25c76d33459eb2a8d487dcd484c6df0e76d78708.scope: Deactivated successfully. Feb 23 04:49:50 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:50 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:50 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:50 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:50 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626463.qcthuc (monmap changed)... 
Feb 23 04:49:50 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:50 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626463.qcthuc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:50 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626463.qcthuc on np0005626463.localdomain Feb 23 04:49:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:50 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:50 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 23 04:49:50 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:50 localhost systemd[1]: var-lib-containers-storage-overlay-2a3256188aa71ccd5d56dce7a4a0ab2c4e98074d8d862e0e18879f63fd382dac-merged.mount: Deactivated 
successfully. Feb 23 04:49:50 localhost podman[305750]: Feb 23 04:49:50 localhost podman[305750]: 2026-02-23 09:49:50.868902607 +0000 UTC m=+0.076478379 container create 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.expose-services=, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, name=rhceph, GIT_BRANCH=main) Feb 23 04:49:50 localhost systemd[1]: Started libpod-conmon-74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7.scope. Feb 23 04:49:50 localhost systemd[1]: Started libcrun container. 
Feb 23 04:49:50 localhost podman[305750]: 2026-02-23 09:49:50.843190595 +0000 UTC m=+0.050766467 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:49:50 localhost podman[305750]: 2026-02-23 09:49:50.95115651 +0000 UTC m=+0.158732292 container init 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, version=7, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph) Feb 23 04:49:50 localhost podman[305750]: 2026-02-23 09:49:50.96331845 +0000 UTC m=+0.170894222 container start 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, GIT_BRANCH=main, release=1770267347, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vendor=Red Hat, Inc., 
io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.buildah.version=1.42.2) Feb 23 04:49:50 localhost podman[305750]: 2026-02-23 09:49:50.964069434 +0000 UTC m=+0.171645236 container attach 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, ceph=True, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, distribution-scope=public) Feb 23 04:49:50 localhost serene_agnesi[305765]: 167 167 Feb 23 04:49:50 localhost systemd[1]: libpod-74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7.scope: Deactivated successfully. Feb 23 04:49:50 localhost podman[305750]: 2026-02-23 09:49:50.969178289 +0000 UTC m=+0.176754131 container died 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:49:51 localhost podman[305770]: 
2026-02-23 09:49:51.071687299 +0000 UTC m=+0.089576388 container remove 74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_agnesi, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Feb 23 04:49:51 localhost systemd[1]: libpod-conmon-74b83f61184b317e21feca4fece2b1cd5d3e98349045c9755116c4f4cb6a8cb7.scope: Deactivated successfully. 
Feb 23 04:49:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:49:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:49:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:49:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:49:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626463.wtksup (monmap changed)... 
Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626463.wtksup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:51 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626463.wtksup on np0005626463.localdomain Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:51 localhost nova_compute[282206]: 2026-02-23 09:49:51.325 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:51 localhost nova_compute[282206]: 2026-02-23 09:49:51.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:51 localhost nova_compute[282206]: 2026-02-23 09:49:51.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:51 localhost nova_compute[282206]: 2026-02-23 09:49:51.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:51 localhost nova_compute[282206]: 2026-02-23 09:49:51.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:51 localhost nova_compute[282206]: 2026-02-23 09:49:51.358 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:51 localhost sshd[305787]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:49:51 localhost systemd[1]: var-lib-containers-storage-overlay-6a3b81da0c89bdf3142d4296e702090d78b44bc3007472fbd1e2db96cf492f8c-merged.mount: Deactivated successfully. Feb 23 04:49:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:52 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:52 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:52 localhost ceph-mon[294160]: Reconfiguring crash.np0005626465 (monmap changed)... 
Feb 23 04:49:52 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:52 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626465", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:52 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626465 on np0005626465.localdomain Feb 23 04:49:52 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:52 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:53 localhost ceph-mon[294160]: Reconfiguring osd.0 (monmap changed)... Feb 23 04:49:53 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Feb 23 04:49:53 localhost ceph-mon[294160]: Reconfiguring daemon osd.0 on np0005626465.localdomain Feb 23 04:49:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:54 
localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: Reconfiguring osd.3 (monmap changed)... Feb 23 04:49:55 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Feb 23 04:49:55 localhost ceph-mon[294160]: Reconfiguring daemon osd.3 on np0005626465.localdomain Feb 23 04:49:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:55 localhost 
ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Feb 23 04:49:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:49:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:49:55 localhost podman[305790]: 2026-02-23 09:49:55.920469781 +0000 UTC m=+0.087970048 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:49:55 localhost podman[305790]: 2026-02-23 09:49:55.935256771 +0000 UTC m=+0.102757008 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:49:55 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:49:56 localhost podman[305789]: 2026-02-23 09:49:56.025126946 +0000 UTC m=+0.192874031 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 23 04:49:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:56 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost podman[305789]: 2026-02-23 09:49:56.115767486 +0000 UTC m=+0.283514591 container exec_died 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:49:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:56 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:49:56 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:49:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Feb 23 04:49:56 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: Reconfiguring mds.mds.np0005626465.drvnoy (monmap changed)... 
Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005626465.drvnoy", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Feb 23 04:49:56 localhost ceph-mon[294160]: Reconfiguring daemon mds.mds.np0005626465.drvnoy on np0005626465.localdomain Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:56 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005626465.hlpkwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Feb 23 04:49:56 localhost nova_compute[282206]: 2026-02-23 09:49:56.359 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:49:56 localhost nova_compute[282206]: 2026-02-23 09:49:56.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:56 localhost nova_compute[282206]: 2026-02-23 09:49:56.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:49:56 localhost nova_compute[282206]: 2026-02-23 09:49:56.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:56 localhost nova_compute[282206]: 2026-02-23 09:49:56.362 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:49:56 localhost nova_compute[282206]: 2026-02-23 09:49:56.365 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:49:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:49:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:49:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Feb 23 04:49:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:57 localhost ceph-mon[294160]: Reconfiguring mgr.np0005626465.hlpkwo (monmap changed)... 
Feb 23 04:49:57 localhost ceph-mon[294160]: Reconfiguring daemon mgr.np0005626465.hlpkwo on np0005626465.localdomain Feb 23 04:49:57 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:57 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:57 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:57 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005626466", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Feb 23 04:49:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 23 04:49:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:58 localhost ceph-mon[294160]: Reconfiguring crash.np0005626466 (monmap changed)... 
Feb 23 04:49:58 localhost ceph-mon[294160]: Reconfiguring daemon crash.np0005626466 on np0005626466.localdomain Feb 23 04:49:58 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:58 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:58 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:58 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Feb 23 04:49:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:58 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:59 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:49:59 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:49:59 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost ceph-mon[294160]: Saving service mon spec with placement label:mon Feb 23 04:49:59 localhost ceph-mon[294160]: Reconfiguring osd.1 (monmap changed)... 
Feb 23 04:49:59 localhost ceph-mon[294160]: Reconfiguring daemon osd.1 on np0005626466.localdomain Feb 23 04:49:59 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:49:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:49:59 localhost podman[305838]: 2026-02-23 09:49:59.924497373 +0000 UTC m=+0.101875061 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 04:49:59 localhost podman[305838]: 2026-02-23 09:49:59.935015043 +0000 UTC m=+0.112392761 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 
'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:49:59 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005626461.lrfquh on host np0005626461.localdomain not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(cluster) log [WRN] : stray host np0005626461.localdomain has 1 stray daemons: ['mgr.np0005626461.lrfquh'] Feb 23 04:50:00 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Feb 23 04:50:00 localhost ceph-mon[294160]: Reconfiguring daemon osd.4 on np0005626466.localdomain Feb 23 04:50:00 localhost ceph-mon[294160]: Health detail: HEALTH_WARN 1 stray daemon(s) not 
managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: stray daemon mgr.np0005626461.lrfquh on host np0005626461.localdomain not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Feb 23 04:50:00 localhost ceph-mon[294160]: stray host np0005626461.localdomain has 1 stray daemons: ['mgr.np0005626461.lrfquh'] Feb 23 04:50:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:50:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 
322961408 Feb 23 04:50:01 localhost nova_compute[282206]: 2026-02-23 09:50:01.366 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:01 localhost nova_compute[282206]: 2026-02-23 09:50:01.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:01 localhost nova_compute[282206]: 2026-02-23 09:50:01.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:01 localhost nova_compute[282206]: 2026-02-23 09:50:01.369 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:01 localhost nova_compute[282206]: 2026-02-23 09:50:01.392 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:01 localhost nova_compute[282206]: 2026-02-23 09:50:01.393 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:50:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:50:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:50:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Feb 23 04:50:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: Reconfiguring mon.np0005626466 (monmap changed)... 
Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:50:01 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626466 on np0005626466.localdomain Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:01 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:02 localhost podman[305928]: Feb 23 04:50:02 localhost podman[305928]: 2026-02-23 09:50:02.431607397 +0000 UTC m=+0.083866324 container create 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, 
vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:50:02 localhost systemd[1]: Started libpod-conmon-33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68.scope. Feb 23 04:50:02 localhost podman[305928]: 2026-02-23 09:50:02.398743296 +0000 UTC m=+0.051002263 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Feb 23 04:50:02 localhost systemd[1]: Started libcrun container. Feb 23 04:50:02 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:50:02 localhost podman[305928]: 2026-02-23 09:50:02.516967185 +0000 UTC m=+0.169226112 container init 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, ceph=True, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux , io.buildah.version=1.42.2, 
distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64) Feb 23 04:50:02 localhost podman[305928]: 2026-02-23 09:50:02.562533612 +0000 UTC m=+0.214792539 container start 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, release=1770267347, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:50:02 localhost podman[305928]: 2026-02-23 09:50:02.563050197 +0000 UTC m=+0.215309174 container attach 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, maintainer=Guillaume Abrioux , description=Red 
Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, architecture=x86_64, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:50:02 localhost jovial_wilson[305943]: 167 167 Feb 23 04:50:02 localhost systemd[1]: libpod-33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68.scope: Deactivated successfully. 
Feb 23 04:50:02 localhost podman[305928]: 2026-02-23 09:50:02.566534693 +0000 UTC m=+0.218793620 container died 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , ceph=True, GIT_BRANCH=main, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z) Feb 23 04:50:02 localhost podman[305948]: 2026-02-23 09:50:02.674784818 +0000 UTC m=+0.091691952 container remove 33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jovial_wilson, maintainer=Guillaume Abrioux , version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-type=git, RELEASE=main, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc.) Feb 23 04:50:02 localhost systemd[1]: libpod-conmon-33fd161f6b5f5cc101635e6b730c27bb93d088aeabccce0eef145f2117fe3f68.scope: Deactivated successfully. Feb 23 04:50:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:50:02 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:50:02 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:50:03 localhost podman[305965]: 2026-02-23 09:50:03.419844205 +0000 UTC m=+0.091861417 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 04:50:03 localhost systemd[1]: 
var-lib-containers-storage-overlay-87234b8af18ff0dfc55697c9b7c7ba4949dd82605185e6b8db0eda3121ff4a5a-merged.mount: Deactivated successfully. Feb 23 04:50:03 localhost podman[305965]: 2026-02-23 09:50:03.455286254 +0000 UTC m=+0.127303456 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible) Feb 23 04:50:03 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:50:03 localhost ceph-mon[294160]: Reconfiguring mon.np0005626463 (monmap changed)... Feb 23 04:50:03 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626463 on np0005626463.localdomain Feb 23 04:50:03 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:03 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:03 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Feb 23 04:50:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:50:03 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:50:03 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:04 localhost ceph-mon[294160]: Reconfiguring mon.np0005626465 (monmap changed)... 
Feb 23 04:50:04 localhost ceph-mon[294160]: Reconfiguring daemon mon.np0005626465 on np0005626465.localdomain Feb 23 04:50:04 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:04 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:50:06 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e39: np0005626466.nisqfq(active, since 25s), standbys: np0005626465.hlpkwo, np0005626463.wtksup Feb 23 04:50:06 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:06 localhost nova_compute[282206]: 2026-02-23 09:50:06.394 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:06 localhost nova_compute[282206]: 2026-02-23 09:50:06.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:06 localhost nova_compute[282206]: 2026-02-23 09:50:06.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:06 localhost nova_compute[282206]: 2026-02-23 09:50:06.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:06 localhost nova_compute[282206]: 2026-02-23 09:50:06.397 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering 
ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:06 localhost nova_compute[282206]: 2026-02-23 09:50:06.398 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:07 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:50:09 localhost podman[242954]: time="2026-02-23T09:50:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:50:09 localhost podman[242954]: @ - - [23/Feb/2026:09:50:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:50:09 localhost podman[242954]: @ - - [23/Feb/2026:09:50:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18291 "" "Go-http-client/1.1" Feb 23 04:50:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:11 localhost nova_compute[282206]: 2026-02-23 09:50:11.402 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:11 localhost nova_compute[282206]: 2026-02-23 09:50:11.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:11 localhost nova_compute[282206]: 2026-02-23 09:50:11.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:11 localhost nova_compute[282206]: 2026-02-23 09:50:11.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:11 localhost nova_compute[282206]: 2026-02-23 09:50:11.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:11 localhost nova_compute[282206]: 2026-02-23 09:50:11.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:50:12 localhost podman[305984]: 2026-02-23 09:50:12.915776172 +0000 UTC m=+0.091324761 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:50:12 localhost podman[305984]: 2026-02-23 09:50:12.934205082 +0000 UTC m=+0.109753681 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:50:12 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:50:13 localhost openstack_network_exporter[245358]: ERROR 09:50:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:50:13 localhost openstack_network_exporter[245358]: Feb 23 04:50:13 localhost openstack_network_exporter[245358]: ERROR 09:50:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:50:13 localhost openstack_network_exporter[245358]: Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.913 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.913 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.914 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:50:14 localhost nova_compute[282206]: 2026-02-23 09:50:14.914 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:50:15 localhost nova_compute[282206]: 2026-02-23 09:50:15.458 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": 
null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:50:15 localhost nova_compute[282206]: 2026-02-23 09:50:15.478 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:50:15 localhost nova_compute[282206]: 2026-02-23 09:50:15.478 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:50:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:50:15 localhost podman[306006]: 2026-02-23 09:50:15.918403847 +0000 UTC m=+0.080894004 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7) Feb 23 04:50:15 localhost podman[306006]: 2026-02-23 09:50:15.957655391 +0000 UTC m=+0.120145558 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, release=1770267347, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible) Feb 23 04:50:15 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.430 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.465 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:16 localhost nova_compute[282206]: 2026-02-23 09:50:16.465 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:19 localhost nova_compute[282206]: 2026-02-23 09:50:19.054 282211 DEBUG oslo_service.periodic_task [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:19 localhost nova_compute[282206]: 2026-02-23 09:50:19.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:19 localhost nova_compute[282206]: 2026-02-23 09:50:19.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:50:20 localhost nova_compute[282206]: 2026-02-23 09:50:20.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:20 localhost nova_compute[282206]: 2026-02-23 09:50:20.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 
kv_alloc: 322961408 Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.466 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.469 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.500 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:21 localhost nova_compute[282206]: 2026-02-23 09:50:21.500 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.080 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.080 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.081 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.081 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.081 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:50:22 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:50:22 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1980972140' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.530 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.590 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.590 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.834 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.836 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11733MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.836 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.837 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.949 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.949 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.950 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:50:22 localhost nova_compute[282206]: 2026-02-23 09:50:22.999 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:50:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:50:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2213743575' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:50:23 localhost nova_compute[282206]: 2026-02-23 09:50:23.454 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:50:23 localhost nova_compute[282206]: 2026-02-23 09:50:23.461 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:50:23 localhost nova_compute[282206]: 2026-02-23 09:50:23.479 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:50:23 localhost nova_compute[282206]: 2026-02-23 09:50:23.481 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:50:23 localhost nova_compute[282206]: 2026-02-23 09:50:23.482 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:50:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:26 localhost nova_compute[282206]: 2026-02-23 09:50:26.535 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:26 localhost nova_compute[282206]: 2026-02-23 09:50:26.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:26 localhost nova_compute[282206]: 2026-02-23 09:50:26.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:26 localhost nova_compute[282206]: 2026-02-23 09:50:26.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:26 localhost nova_compute[282206]: 2026-02-23 09:50:26.538 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:26 localhost nova_compute[282206]: 2026-02-23 09:50:26.543 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:50:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:50:26 localhost podman[306070]: 2026-02-23 09:50:26.916680118 +0000 UTC m=+0.086299108 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260216) Feb 23 04:50:26 localhost podman[306070]: 2026-02-23 09:50:26.958305604 +0000 UTC m=+0.127924624 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true) Feb 23 04:50:26 localhost podman[306071]: 2026-02-23 09:50:26.967929537 +0000 UTC m=+0.134331859 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', 
'--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:50:26 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:50:26 localhost podman[306071]: 2026-02-23 09:50:26.981297534 +0000 UTC m=+0.147699896 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck 
node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:50:26 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:50:29 localhost sshd[306118]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:50:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:50:30 localhost podman[306120]: 2026-02-23 09:50:30.423187438 +0000 UTC m=+0.083181151 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:50:30 localhost podman[306120]: 2026-02-23 09:50:30.431958705 +0000 UTC m=+0.091952378 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 
'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 23 04:50:30 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:50:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:31 localhost nova_compute[282206]: 2026-02-23 09:50:31.544 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:31 localhost nova_compute[282206]: 2026-02-23 09:50:31.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:31 localhost nova_compute[282206]: 2026-02-23 09:50:31.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:31 localhost nova_compute[282206]: 2026-02-23 09:50:31.546 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:31 localhost nova_compute[282206]: 2026-02-23 09:50:31.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:31 localhost nova_compute[282206]: 2026-02-23 09:50:31.580 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:50:33 localhost podman[306139]: 2026-02-23 09:50:33.911324019 +0000 UTC m=+0.086839754 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:50:33 localhost 
podman[306139]: 2026-02-23 09:50:33.917209318 +0000 UTC m=+0.092725093 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:50:33 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:50:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:36 localhost nova_compute[282206]: 2026-02-23 09:50:36.581 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:36 localhost nova_compute[282206]: 2026-02-23 09:50:36.583 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:36 localhost nova_compute[282206]: 2026-02-23 09:50:36.584 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:36 localhost nova_compute[282206]: 2026-02-23 09:50:36.584 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:36 localhost nova_compute[282206]: 2026-02-23 09:50:36.604 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:36 localhost nova_compute[282206]: 2026-02-23 09:50:36.605 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:39 localhost podman[242954]: time="2026-02-23T09:50:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:50:39 localhost podman[242954]: @ - - [23/Feb/2026:09:50:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 
23 04:50:39 localhost podman[242954]: @ - - [23/Feb/2026:09:50:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18290 "" "Go-http-client/1.1" Feb 23 04:50:40 localhost sshd[306157]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:50:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:41 localhost nova_compute[282206]: 2026-02-23 09:50:41.606 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:41 localhost nova_compute[282206]: 2026-02-23 09:50:41.608 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:41 localhost nova_compute[282206]: 2026-02-23 09:50:41.609 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:41 localhost nova_compute[282206]: 2026-02-23 09:50:41.609 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:41 localhost nova_compute[282206]: 2026-02-23 09:50:41.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:41 localhost nova_compute[282206]: 2026-02-23 09:50:41.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:43 localhost openstack_network_exporter[245358]: ERROR 09:50:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:50:43 localhost 
openstack_network_exporter[245358]: Feb 23 04:50:43 localhost openstack_network_exporter[245358]: ERROR 09:50:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:50:43 localhost openstack_network_exporter[245358]: Feb 23 04:50:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:50:43 localhost podman[306159]: 2026-02-23 09:50:43.913587459 +0000 UTC m=+0.086317127 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:50:43 localhost podman[306159]: 2026-02-23 09:50:43.922783351 +0000 UTC m=+0.095513009 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 
'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:50:43 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:50:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:46 localhost nova_compute[282206]: 2026-02-23 09:50:46.645 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:46 localhost nova_compute[282206]: 2026-02-23 09:50:46.647 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:46 localhost nova_compute[282206]: 2026-02-23 09:50:46.647 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:46 localhost nova_compute[282206]: 2026-02-23 09:50:46.648 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:46 localhost nova_compute[282206]: 2026-02-23 09:50:46.678 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 
23 04:50:46 localhost nova_compute[282206]: 2026-02-23 09:50:46.678 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:50:46 localhost podman[306184]: 2026-02-23 09:50:46.902280449 +0000 UTC m=+0.078454227 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347) Feb 23 04:50:46 localhost podman[306184]: 2026-02-23 09:50:46.914619557 +0000 UTC m=+0.090793335 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, distribution-scope=public, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:50:46 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:50:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:50:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:50:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:50:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:50:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:50:48.554 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:50:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:51 localhost nova_compute[282206]: 2026-02-23 09:50:51.679 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 
04:50:51 localhost nova_compute[282206]: 2026-02-23 09:50:51.680 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:51 localhost nova_compute[282206]: 2026-02-23 09:50:51.681 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:51 localhost nova_compute[282206]: 2026-02-23 09:50:51.681 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:51 localhost nova_compute[282206]: 2026-02-23 09:50:51.709 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:51 localhost nova_compute[282206]: 2026-02-23 09:50:51.710 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.135 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.165 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.166 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3a58282-47aa-44ed-b791-edcbd7927f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.135412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '262e164a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': 'b14f6174a7b1924bed0645e35c409950639554c73b168f152408d47a44cf3844'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.135412', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '262e2cac-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '8a127751fc54542f2917f4a9b25bf66d1b1d39d69bf061b9bb3c1623ecc3bffc'}]}, 'timestamp': '2026-02-23 09:50:56.166856', '_unique_id': '72c79dc885e24a5aaf75dcac365b7922'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:50:56.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.168 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.174 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a8d1c5f-c9fa-4a00-b812-701a44393794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.170095', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '262f5e6a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': 'cfc02a11302be6d047ec3753f003dea55d030c502fc4d41f9ebc5e9a6a691ce4'}]}, 'timestamp': '2026-02-23 09:50:56.174738', '_unique_id': '88d64e77d50a464ba82072cacb5a41f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '225f574e-6c26-45b7-a249-66da55fe33ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.177130', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '262fd0c0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '2b007ab4dfc8f541ba8e5e342ad9a29c9cea6cb1379f463e684d0c43557ebfed'}]}, 'timestamp': '2026-02-23 09:50:56.177646', '_unique_id': 'd4990bc6fa3e4c68a7362f1e368bd380'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.178 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.180 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3585f430-e7ec-4090-a413-13fbae5e0f99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.179985', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '263040d2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '801ea1908d7040b77258fa5fb5bc52678c398b09b16ba818d643f12f1b845340'}]}, 'timestamp': '2026-02-23 09:50:56.180486', '_unique_id': '6011db3ad72f437f911c8c25d1c63566'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.181 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd461f23e-13b9-43df-95fd-faf8f397c2f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.182808', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2630b04e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '4b22dd8aa348e8cadd611bd1e538eca9487624f41c29b7e49c5707f7227f316a'}]}, 'timestamp': '2026-02-23 09:50:56.183343', '_unique_id': '639fedd46f9e43a4a6ca107155509544'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.184 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.185 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49ca14c7-74c3-4480-bf6a-29d2e614478b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.185639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '26311e9e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '449256359c61936e385b3f2bea67f2aeda4ba4d7eb044f91b06182d6beb220c9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.185639', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2631300a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '353fff40f293e529d86f1b233e7af49d383540388187fee0531c058f9f0f1703'}]}, 'timestamp': '2026-02-23 09:50:56.186609', '_unique_id': 'd315b3bf19a24c2b906f360f926723da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.187 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.189 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '65e4d908-262b-49cc-9b6b-c5d122e22dcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.189276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2631aada-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '70ce45cb8f3f26b262933f8ef52c987a62ba5b56467a1e09b3c72220723cf623'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.189276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2631bdc2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '9e1b7934d1d5d97b9c0153498eec1b78bfdd57c4c8d5c5d461d27cf9e01147e1'}]}, 'timestamp': '2026-02-23 09:50:56.190216', '_unique_id': '1409330b51b9451eb03e38e8d776ee8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.191 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send 
notification to notifications. Payload={'message_id': '9a2f6191-9e06-4597-ada2-2798f245fe0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.192601', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '26322cf8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': 'b77c788dc167fbc23cd30562abb86ac7073abea1530daefe43da56a4a99e956f'}]}, 'timestamp': '2026-02-23 09:50:56.193146', '_unique_id': 'ba3b317f3998401bb66dbdcd056e691f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.194 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.195 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81cb3e69-bd4f-4f91-b9f8-d6434dc8c596', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.195618', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2632a2fa-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '7d887917eb4338632be4343abc17297012526ef3db75beab79b8dcb8caa288ad'}]}, 'timestamp': '2026-02-23 09:50:56.196152', '_unique_id': '3117ca2c405a4788b0bbb9377620481e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.197 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd37ce054-8812-410e-ada7-49e55e58a15e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.198469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263314c4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '441594391873d8ce07afad15de5693bb22a3a437a302cd991149b625691c1401'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.198469', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2633287e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': 'cc8e2cd3ca226d4af2d525c8d52768b0b4505fbb87cb533f2b8a538a85ee830b'}]}, 'timestamp': '2026-02-23 09:50:56.199507', '_unique_id': 'f584abf51f81444f8dbd3c87f59f97fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8cb80aaf-ea99-4654-9cb4-d0c177e2795d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.201815', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2633967e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '3dca02fb5375d9fa62ce8cba61c27ced8bb17d9449990f7fb3cd0e55421a7f06'}]}, 'timestamp': '2026-02-23 09:50:56.202338', '_unique_id': '6e1d930d2a6d4d36b088edb3c837814f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.204 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '133bacfd-f0ec-4760-89f5-cbf32f75c347', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.204818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2635e7a8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '08a640b5173a8b4a2447333a6340f4f87e54b3a0d0030811b1a2aa6025de3211'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.204818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2635f8ec-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '0b9edb8f941872fe2a7b0751d590e62b07e85451a6e2e96c86526f5b4cb5b596'}]}, 'timestamp': '2026-02-23 09:50:56.217990', '_unique_id': 'ca729a09df8f41c7a300e719d86bc2d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.219 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222
12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4b95441-09a7-4159-a11e-b3f5eb97c112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.220332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263667d2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': 'cea25f0e81f0ab9b292a597f1402d3d7db656bca0c90616b713e05dd91299cf4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.220332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '26367b46-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '504c07465faef051b050869bf5e3c216557ca99881c6156bb37ad660ebc4a5b8'}]}, 'timestamp': '2026-02-23 09:50:56.221279', '_unique_id': '59fa1e86130b4462b30e43f68813798f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.222 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.223 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55c09baa-55e2-4ce6-aa05-4b0d4b6b957e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.223721', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '2636edb0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '790bed47a7d2adc94d10230301dafe6c5876761dfb27dfa5646ac04665f2ae13'}]}, 'timestamp': '2026-02-23 09:50:56.224254', '_unique_id': 'f243107820434f849e720aa28a8e137b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]:
2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 
04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99e48839-bf61-4381-bc0b-3ffe759f4e5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.226471', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '263758fe-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 'message_signature': '0f3d3ee9ff0aed03809053aea663e5c11c90de7c025e5fbc2d179560d19a63bc'}]}, 'timestamp': '2026-02-23 09:50:56.227011', '_unique_id': '1d318e8857f64565aa1d2968c175c959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '287377ef-0c12-46f2-9023-a00898ebb660', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:50:56.229253', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '263a531a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.435351571, 'message_signature': 'd87e52e61b7c09ee926ab0bc5fb37b0208e3765fa0a78fd33c2c68c833a63093'}]}, 'timestamp': '2026-02-23 09:50:56.246476', '_unique_id': '55daba01d8464c8081fc936f39c078ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:50:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.248 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d00a9a2-45c9-4367-98ed-669ae1acd688', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:50:56.248863', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '263ac41c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.359595199, 
'message_signature': '95c8977cda96e8a4094e5059a9d390eb0a663b6c9771f8fbb1d05655878ba538'}]}, 'timestamp': '2026-02-23 09:50:56.249379', '_unique_id': 'e507f66dc4bb4f40952b70d6c182238e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6b5266e5-9f20-4c9b-a9c8-76de0e25ae40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.251656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263b3136-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '57c50762be8981feff221bba8e300d5d2fd15be6031cc35a1d8a28f355566f05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.251656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263b427a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '6b8d60b837695dd5e6ca7e9249dabce01d59bdaf9cce94df23ca9cd715a083c5'}]}, 'timestamp': '2026-02-23 09:50:56.252583', '_unique_id': '3966151dbd904fdfbf99c0fca8b945bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:50:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.253 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 12010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4261036-a3bc-4ea7-86a0-0f39378a823c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12010000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:50:56.254939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '263bb0ac-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.435351571, 'message_signature': '1a5ba7625224179ca8a70c0f2add09c4300778f144d1eb4cf197015e89a460f6'}]}, 'timestamp': '2026-02-23 09:50:56.255422', '_unique_id': '134f432099aa4457a413ae858f0351e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c1cdcb3-31da-4cb5-8802-b961dde47dff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.257725', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263c1d94-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '355d574f8780627a375794b8a15cfa0e3fe3d00c44cb5b2d87dccf66ed4ed59d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.257725', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263c2fa0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.324894935, 'message_signature': '4dadbf8684b2996d70d98bf04e1b07f49df49030225d901a940f8472f9d0144a'}]}, 'timestamp': '2026-02-23 09:50:56.258683', '_unique_id': '8ab393d45687486eb4eda2a926dbf40d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.259 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.260 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a958edb9-ba3f-4d96-b8a0-8c2d2ee9ab3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:50:56.261012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '263c9d96-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '5631222fe4bd04d006772f3c006389eddfa2850c325a3ed390008e5fe8d68785'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:50:56.261012', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '263cae6c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11696.394350654, 'message_signature': '68c92ecb5f75194ebc39bfff1ed149b6de0926d34ff3ec8c3533251feb7840d0'}]}, 'timestamp': '2026-02-23 09:50:56.261933', '_unique_id': 'f965329f0fa041a98b0ad1dbb04536e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in
ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.262 12 ERROR oslo_messaging.notify.messaging Feb 23 
04:50:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:50:56.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:50:56 localhost nova_compute[282206]: 2026-02-23 09:50:56.710 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:56 localhost nova_compute[282206]: 2026-02-23 09:50:56.742 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:50:56 localhost nova_compute[282206]: 2026-02-23 09:50:56.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5032 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:50:56 localhost nova_compute[282206]: 2026-02-23 09:50:56.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:56 localhost nova_compute[282206]: 2026-02-23 09:50:56.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:50:56 localhost nova_compute[282206]: 2026-02-23 09:50:56.746 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:50:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:50:57 localhost systemd[1]: tmp-crun.BDjUDM.mount: Deactivated successfully. 
Feb 23 04:50:57 localhost podman[306203]: 2026-02-23 09:50:57.922995747 +0000 UTC m=+0.094599731 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:50:58 localhost podman[306204]: 2026-02-23 09:50:58.009045905 +0000 UTC m=+0.175237533 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:50:58 localhost podman[306204]: 2026-02-23 09:50:58.02126236 +0000 UTC m=+0.187453968 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', 
'--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:50:58 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:50:58 localhost podman[306203]: 2026-02-23 09:50:58.062942778 +0000 UTC m=+0.234546732 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0) Feb 23 04:50:58 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:50:58 localhost systemd[1]: tmp-crun.Q1n8DA.mount: Deactivated successfully. Feb 23 04:51:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:51:00 localhost podman[306251]: 2026-02-23 09:51:00.908852639 +0000 UTC m=+0.083346656 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute) Feb 23 04:51:00 localhost podman[306251]: 2026-02-23 09:51:00.921248239 +0000 UTC m=+0.095742236 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute) Feb 23 04:51:00 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:51:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:01 localhost nova_compute[282206]: 2026-02-23 09:51:01.747 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:01 localhost nova_compute[282206]: 2026-02-23 09:51:01.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:01 localhost nova_compute[282206]: 2026-02-23 09:51:01.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:51:01 localhost nova_compute[282206]: 2026-02-23 09:51:01.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:51:01 localhost nova_compute[282206]: 2026-02-23 09:51:01.790 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:01 localhost nova_compute[282206]: 2026-02-23 09:51:01.791 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:51:03 localhost sshd[306270]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:51:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:51:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:51:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:51:04 localhost podman[306359]: 2026-02-23 09:51:04.795937861 +0000 UTC m=+0.086903545 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:51:04 localhost 
podman[306359]: 2026-02-23 09:51:04.802528873 +0000 UTC m=+0.093494557 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:51:04 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:51:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mgr fail"} v 0) Feb 23 04:51:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:51:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e85 do_prune osdmap full prune enabled Feb 23 04:51:04 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Activating manager daemon np0005626465.hlpkwo Feb 23 04:51:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 e86: 6 total, 6 up, 6 in Feb 23 04:51:04 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e86: 6 total, 6 up, 6 in Feb 23 04:51:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:51:04 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e40: np0005626465.hlpkwo(active, starting, since 0.0333241s), standbys: np0005626463.wtksup Feb 23 04:51:05 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Manager daemon np0005626465.hlpkwo is now available Feb 23 04:51:05 localhost systemd-logind[759]: Session 70 logged out. Waiting for processes to exit. Feb 23 04:51:05 localhost systemd[1]: session-70.scope: Deactivated successfully. Feb 23 04:51:05 localhost systemd[1]: session-70.scope: Consumed 10.431s CPU time. Feb 23 04:51:05 localhost systemd-logind[759]: Removed session 70. 
Feb 23 04:51:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} v 0) Feb 23 04:51:05 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} v 0) Feb 23 04:51:05 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch Feb 23 04:51:05 localhost sshd[306377]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:51:05 localhost systemd-logind[759]: New session 71 of user ceph-admin. Feb 23 04:51:05 localhost systemd[1]: Started Session 71 of User ceph-admin. Feb 23 04:51:05 localhost ceph-mon[294160]: from='mgr.26638 172.18.0.108:0/2769928129' entity='mgr.np0005626466.nisqfq' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: from='mgr.26638 ' entity='mgr.np0005626466.nisqfq' Feb 23 04:51:05 localhost ceph-mon[294160]: from='client.? 172.18.0.200:0/2649255566' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: Activating manager daemon np0005626465.hlpkwo Feb 23 04:51:05 localhost ceph-mon[294160]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Feb 23 04:51:05 localhost ceph-mon[294160]: Manager daemon np0005626465.hlpkwo is now available Feb 23 04:51:05 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/mirror_snapshot_schedule"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005626465.hlpkwo/trash_purge_schedule"} : dispatch Feb 23 04:51:05 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e41: np0005626465.hlpkwo(active, since 1.05196s), standbys: np0005626463.wtksup Feb 23 04:51:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:06 localhost podman[306488]: 2026-02-23 09:51:06.492167105 +0000 UTC m=+0.099676517 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.42.2, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph) Feb 23 04:51:06 localhost podman[306488]: 2026-02-23 09:51:06.625578415 +0000 UTC m=+0.233087837 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, maintainer=Guillaume Abrioux , name=rhceph, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, io.buildah.version=1.42.2, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, CEPH_POINT_RELEASE=) Feb 23 04:51:06 localhost nova_compute[282206]: 2026-02-23 09:51:06.791 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:06 localhost nova_compute[282206]: 2026-02-23 09:51:06.794 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:06 localhost nova_compute[282206]: 2026-02-23 09:51:06.795 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:51:06 localhost nova_compute[282206]: 2026-02-23 09:51:06.795 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:51:06 localhost nova_compute[282206]: 2026-02-23 09:51:06.846 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:06 localhost nova_compute[282206]: 2026-02-23 09:51:06.847 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:51:06 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Feb 23 04:51:06 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST 
(was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Feb 23 04:51:06 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : Cluster is now healthy Feb 23 04:51:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e42: np0005626465.hlpkwo(active, since 2s), standbys: np0005626463.wtksup Feb 23 04:51:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:51:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Bus STARTING Feb 23 04:51:08 localhost ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Serving on http://172.18.0.107:8765 Feb 23 04:51:08 localhost ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Serving on https://172.18.0.107:7150 Feb 23 04:51:08 localhost ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Bus STARTED Feb 23 04:51:08 localhost ceph-mon[294160]: [23/Feb/2026:09:51:06] ENGINE Client ('172.18.0.107', 34908) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Feb 23 04:51:08 localhost ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Feb 23 04:51:08 localhost ceph-mon[294160]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Feb 23 04:51:08 localhost ceph-mon[294160]: Cluster is now healthy Feb 23 04:51:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: 
log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:51:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:51:09 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 23 04:51:09 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command 
mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 23 04:51:09 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 04:51:09 localhost podman[242954]: time="2026-02-23T09:51:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:51:09 localhost podman[242954]: @ - - [23/Feb/2026:09:51:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:51:09 localhost podman[242954]: @ - - [23/Feb/2026:09:51:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18284 "" "Go-http-client/1.1" Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 
localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 04:51:09 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 04:51:09 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' 
entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 04:51:09 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 04:51:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:51:09 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.conf Feb 23 04:51:09 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.conf Feb 23 04:51:09 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.conf Feb 23 04:51:09 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e43: np0005626465.hlpkwo(active, since 4s), standbys: np0005626463.wtksup Feb 23 04:51:09 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : Standby manager daemon np0005626466.nisqfq started Feb 23 04:51:10 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:51:10 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 23 04:51:10 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.conf Feb 
23 04:51:10 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e44: np0005626465.hlpkwo(active, since 5s), standbys: np0005626463.wtksup, np0005626466.nisqfq Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:11 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:51:11 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:51:11 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/etc/ceph/ceph.client.admin.keyring Feb 23 04:51:11 localhost ceph-mon[294160]: Updating np0005626465.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:51:11 localhost ceph-mon[294160]: Updating np0005626463.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:51:11 localhost ceph-mon[294160]: Updating np0005626466.localdomain:/var/lib/ceph/f1fea371-cb69-578d-a3d0-b5c472a84b46/config/ceph.client.admin.keyring Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:51:11 localhost nova_compute[282206]: 2026-02-23 09:51:11.847 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:11 localhost nova_compute[282206]: 2026-02-23 09:51:11.849 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:11 localhost nova_compute[282206]: 2026-02-23 09:51:11.849 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:51:11 localhost nova_compute[282206]: 2026-02-23 09:51:11.850 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:11 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:51:11 localhost nova_compute[282206]: 2026-02-23 09:51:11.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:11 localhost nova_compute[282206]: 2026-02-23 09:51:11.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:51:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:51:12 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:51:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:13 localhost openstack_network_exporter[245358]: ERROR 
09:51:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:51:13 localhost openstack_network_exporter[245358]: Feb 23 04:51:13 localhost openstack_network_exporter[245358]: ERROR 09:51:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:51:13 localhost openstack_network_exporter[245358]: Feb 23 04:51:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:51:14 localhost podman[307404]: 2026-02-23 09:51:14.910424814 +0000 UTC m=+0.085347368 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:51:14 localhost podman[307404]: 2026-02-23 09:51:14.924366742 +0000 UTC m=+0.099289286 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:51:14 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:51:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:51:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:16 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:51:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:16 localhost nova_compute[282206]: 2026-02-23 09:51:16.887 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.483 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.484 282211 
DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.485 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.582 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.583 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.583 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:51:17 localhost nova_compute[282206]: 2026-02-23 09:51:17.584 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:51:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:51:17 localhost systemd[1]: tmp-crun.CMYvMQ.mount: Deactivated successfully. Feb 23 04:51:17 localhost podman[307427]: 2026-02-23 09:51:17.907980645 +0000 UTC m=+0.084279005 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., version=9.7, release=1770267347, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': 
True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:51:17 localhost podman[307427]: 2026-02-23 09:51:17.924204412 +0000 UTC m=+0.100502762 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, 
maintainer=Red Hat, Inc.) Feb 23 04:51:17 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:51:18 localhost nova_compute[282206]: 2026-02-23 09:51:18.295 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:51:18 localhost nova_compute[282206]: 2026-02-23 09:51:18.314 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:51:18 localhost nova_compute[282206]: 
2026-02-23 09:51:18.314 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:51:18 localhost nova_compute[282206]: 2026-02-23 09:51:18.315 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:19 localhost nova_compute[282206]: 2026-02-23 09:51:19.882 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:20 localhost nova_compute[282206]: 2026-02-23 09:51:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:20 localhost nova_compute[282206]: 2026-02-23 09:51:20.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:20 localhost nova_compute[282206]: 2026-02-23 09:51:20.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:51:21 localhost nova_compute[282206]: 2026-02-23 09:51:21.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:21 localhost nova_compute[282206]: 2026-02-23 09:51:21.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:21 localhost nova_compute[282206]: 2026-02-23 09:51:21.889 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:22 localhost nova_compute[282206]: 2026-02-23 09:51:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:23 localhost nova_compute[282206]: 2026-02-23 09:51:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task 
ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.077 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.078 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 
23 04:51:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:51:24 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3454500488' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.552 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.608 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.609 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.809 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.811 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11743MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.811 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.812 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.876 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.877 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:51:24 localhost nova_compute[282206]: 2026-02-23 09:51:24.918 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:51:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:51:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1539052837' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:51:25 localhost nova_compute[282206]: 2026-02-23 09:51:25.350 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:51:25 localhost nova_compute[282206]: 2026-02-23 09:51:25.356 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:51:25 localhost nova_compute[282206]: 2026-02-23 09:51:25.389 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:51:25 localhost nova_compute[282206]: 2026-02-23 09:51:25.392 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:51:25 localhost nova_compute[282206]: 2026-02-23 09:51:25.393 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:51:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:26 localhost nova_compute[282206]: 2026-02-23 09:51:26.892 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.278362) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287278441, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2447, "num_deletes": 256, "total_data_size": 6625461, "memory_usage": 6807144, "flush_reason": "Manual Compaction"} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287312148, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 6126102, "file_checksum": "", "file_checksum_func_name": "Unknown", 
"smallest_seqno": 20075, "largest_seqno": 22521, "table_properties": {"data_size": 6115243, "index_size": 6788, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 26255, "raw_average_key_size": 22, "raw_value_size": 6092245, "raw_average_value_size": 5145, "num_data_blocks": 292, "num_entries": 1184, "num_filter_entries": 1184, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840180, "oldest_key_time": 1771840180, "file_creation_time": 1771840287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 33841 microseconds, and 14261 cpu microseconds. Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.312210) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 6126102 bytes OK Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.312239) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.314457) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.314480) EVENT_LOG_v1 {"time_micros": 1771840287314474, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.314503) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 6614376, prev total WAL file size 6614376, number of live WAL files 2. Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.315836) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. 
'7061786F73003131353437' seq:0, type:0; will stop at (end) Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(5982KB)], [33(15MB)] Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287315915, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 22457026, "oldest_snapshot_seqno": -1} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12283 keys, 19493283 bytes, temperature: kUnknown Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287444935, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 19493283, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19420243, "index_size": 41259, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30725, "raw_key_size": 327917, "raw_average_key_size": 26, "raw_value_size": 19207895, "raw_average_value_size": 1563, "num_data_blocks": 1588, "num_entries": 12283, "num_filter_entries": 12283, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.445203) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 19493283 bytes Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.446960) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.0 rd, 151.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.8, 15.6 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(6.8) write-amplify(3.2) OK, records in: 12823, records dropped: 540 output_compression: NoCompression Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.446991) EVENT_LOG_v1 {"time_micros": 1771840287446979, "job": 18, "event": "compaction_finished", "compaction_time_micros": 129074, "compaction_time_cpu_micros": 53367, "output_level": 6, "num_output_files": 1, "total_output_size": 19493283, "num_input_records": 12823, "num_output_records": 12283, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287448031, "job": 18, "event": "table_file_deletion", "file_number": 35} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840287450458, "job": 18, "event": "table_file_deletion", "file_number": 33} Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.315701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450547) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450559) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450562) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:27 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:27.450565) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:51:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:51:28 localhost podman[307492]: 2026-02-23 09:51:28.912056382 +0000 UTC m=+0.085321477 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS) Feb 23 04:51:28 localhost podman[307493]: 2026-02-23 09:51:28.965026156 +0000 UTC m=+0.134987000 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:51:28 localhost podman[307493]: 2026-02-23 09:51:28.976474807 +0000 UTC m=+0.146435661 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', 
'--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:51:28 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:51:29 localhost podman[307492]: 2026-02-23 09:51:29.027022097 +0000 UTC m=+0.200287182 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0) Feb 23 04:51:29 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:51:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:51:31 localhost nova_compute[282206]: 2026-02-23 09:51:31.894 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:51:31 localhost nova_compute[282206]: 2026-02-23 09:51:31.897 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:31 localhost podman[307544]: 2026-02-23 09:51:31.909992543 +0000 UTC m=+0.085233204 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:51:31 localhost podman[307544]: 2026-02-23 09:51:31.920994861 +0000 UTC m=+0.096235472 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:51:31 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:51:32 localhost sshd[307563]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:51:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:51:35 localhost podman[307565]: 2026-02-23 09:51:35.910654597 +0000 UTC m=+0.084924784 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:51:35 localhost 
podman[307565]: 2026-02-23 09:51:35.94303004 +0000 UTC m=+0.117300237 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:51:35 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:51:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.243151) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296243187, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 323, "num_deletes": 251, "total_data_size": 88314, "memory_usage": 95160, "flush_reason": "Manual Compaction"} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296249717, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 86555, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22522, "largest_seqno": 22844, "table_properties": {"data_size": 84483, "index_size": 247, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5806, "raw_average_key_size": 20, "raw_value_size": 80359, "raw_average_value_size": 280, "num_data_blocks": 11, "num_entries": 286, "num_filter_entries": 286, 
"num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840289, "oldest_key_time": 1771840289, "file_creation_time": 1771840296, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 6608 microseconds, and 1085 cpu microseconds. Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.249760) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 86555 bytes OK Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.249779) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.251721) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.251747) EVENT_LOG_v1 {"time_micros": 1771840296251741, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.251767) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 86066, prev total WAL file size 86390, number of live WAL files 2. Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.252606) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373532' seq:72057594037927935, type:22 .. 
'6D6772737461740034303034' seq:0, type:0; will stop at (end) Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(84KB)], [36(18MB)] Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296252648, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19579838, "oldest_snapshot_seqno": -1} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12054 keys, 17350766 bytes, temperature: kUnknown Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296363859, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 17350766, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17284081, "index_size": 35480, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30149, "raw_key_size": 323351, "raw_average_key_size": 26, "raw_value_size": 17080516, "raw_average_value_size": 1416, "num_data_blocks": 1346, "num_entries": 12054, "num_filter_entries": 12054, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840296, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.364243) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 17350766 bytes Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.365967) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.8 rd, 155.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.6 +0.0 blob) out(16.5 +0.0 blob), read-write-amplify(426.7) write-amplify(200.5) OK, records in: 12569, records dropped: 515 output_compression: NoCompression Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.365999) EVENT_LOG_v1 {"time_micros": 1771840296365983, "job": 20, "event": "compaction_finished", "compaction_time_micros": 111375, "compaction_time_cpu_micros": 49515, "output_level": 6, "num_output_files": 1, "total_output_size": 17350766, "num_input_records": 12569, "num_output_records": 12054, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296366162, "job": 20, "event": "table_file_deletion", "file_number": 38} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840296368918, "job": 20, "event": "table_file_deletion", "file_number": 36} Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.252538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368995) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.368998) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:36 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:51:36.369002) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:51:36 localhost nova_compute[282206]: 2026-02-23 09:51:36.896 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:39 localhost podman[242954]: time="2026-02-23T09:51:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:51:39 localhost podman[242954]: @ - - [23/Feb/2026:09:51:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:51:39 localhost podman[242954]: @ - - [23/Feb/2026:09:51:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18286 "" "Go-http-client/1.1" Feb 23 04:51:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:41 localhost nova_compute[282206]: 2026-02-23 09:51:41.898 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:43 localhost openstack_network_exporter[245358]: ERROR 09:51:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:51:43 localhost openstack_network_exporter[245358]: Feb 23 04:51:43 localhost openstack_network_exporter[245358]: ERROR 09:51:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:51:43 localhost openstack_network_exporter[245358]: Feb 23 04:51:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:51:45 localhost systemd[1]: tmp-crun.f64STc.mount: Deactivated successfully. 
Feb 23 04:51:45 localhost podman[307583]: 2026-02-23 09:51:45.90740763 +0000 UTC m=+0.083496210 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:51:45 localhost podman[307583]: 2026-02-23 09:51:45.921306086 +0000 UTC m=+0.097394676 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:51:45 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:51:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:46 localhost nova_compute[282206]: 2026-02-23 09:51:46.901 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:51:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:51:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:51:48.553 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:51:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:51:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:51:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:51:48 localhost systemd[1]: tmp-crun.CAY8qW.mount: Deactivated successfully. 
Feb 23 04:51:48 localhost podman[307607]: 2026-02-23 09:51:48.913423088 +0000 UTC m=+0.086943766 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, version=9.7, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:51:48 localhost podman[307607]: 2026-02-23 09:51:48.929400979 +0000 UTC m=+0.102921647 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9) Feb 23 04:51:48 localhost systemd[1]: 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:51:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:51 localhost nova_compute[282206]: 2026-02-23 09:51:51.903 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:51:56 localhost nova_compute[282206]: 2026-02-23 09:51:56.905 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:51:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:51:59 localhost systemd[1]: tmp-crun.GYOjMW.mount: Deactivated successfully. 
Feb 23 04:51:59 localhost podman[307627]: 2026-02-23 09:51:59.91303019 +0000 UTC m=+0.088550346 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:51:59 localhost podman[307628]: 2026-02-23 09:51:59.974902087 +0000 UTC m=+0.144292064 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:51:59 localhost podman[307628]: 2026-02-23 09:51:59.98642326 +0000 UTC m=+0.155813237 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', 
'--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:51:59 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:52:00 localhost podman[307627]: 2026-02-23 09:52:00.001770101 +0000 UTC m=+0.177290207 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 23 04:52:00 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:52:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:01 localhost nova_compute[282206]: 2026-02-23 09:52:01.907 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:52:02 localhost podman[307675]: 2026-02-23 09:52:02.90503371 +0000 UTC m=+0.081208391 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0) Feb 23 04:52:02 localhost podman[307675]: 2026-02-23 09:52:02.915198282 +0000 UTC m=+0.091373003 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:52:02 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:52:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:52:06 localhost podman[307694]: 2026-02-23 09:52:06.909156489 +0000 UTC m=+0.081210801 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:52:06 localhost nova_compute[282206]: 2026-02-23 09:52:06.909 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:06 localhost podman[307694]: 2026-02-23 09:52:06.922309532 +0000 UTC m=+0.094363834 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true) Feb 23 04:52:06 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:52:09 localhost podman[242954]: time="2026-02-23T09:52:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:52:09 localhost podman[242954]: @ - - [23/Feb/2026:09:52:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155258 "" "Go-http-client/1.1" Feb 23 04:52:09 localhost podman[242954]: @ - - [23/Feb/2026:09:52:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18285 "" "Go-http-client/1.1" Feb 23 04:52:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:11 localhost nova_compute[282206]: 2026-02-23 09:52:11.912 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:52:13 localhost openstack_network_exporter[245358]: ERROR 
09:52:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:52:13 localhost openstack_network_exporter[245358]: Feb 23 04:52:13 localhost openstack_network_exporter[245358]: ERROR 09:52:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:52:13 localhost openstack_network_exporter[245358]: Feb 23 04:52:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:52:14 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:52:14 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:52:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:52:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:52:16 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:52:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:52:16 localhost nova_compute[282206]: 2026-02-23 09:52:16.914 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:52:16 localhost podman[307797]: 2026-02-23 09:52:16.919911161 +0000 UTC m=+0.088198335 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:52:16 localhost podman[307797]: 2026-02-23 09:52:16.935320313 +0000 UTC m=+0.103607467 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:52:16 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.394 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.394 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.395 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock 
"refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:52:18 localhost nova_compute[282206]: 2026-02-23 09:52:18.988 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:52:19 localhost nova_compute[282206]: 2026-02-23 09:52:19.386 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": 
"a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:52:19 localhost nova_compute[282206]: 2026-02-23 09:52:19.403 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:52:19 localhost nova_compute[282206]: 2026-02-23 09:52:19.403 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:52:19 localhost nova_compute[282206]: 2026-02-23 09:52:19.404 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:52:19 localhost podman[307819]: 2026-02-23 09:52:19.912426597 +0000 UTC m=+0.082275553 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9/ubi-minimal) Feb 23 04:52:19 localhost podman[307819]: 2026-02-23 09:52:19.929633835 +0000 UTC m=+0.099482791 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.) 
Feb 23 04:52:19 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:52:20 localhost nova_compute[282206]: 2026-02-23 09:52:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:20 localhost nova_compute[282206]: 2026-02-23 09:52:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:20 localhost nova_compute[282206]: 2026-02-23 09:52:20.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.919 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:52:21 localhost nova_compute[282206]: 2026-02-23 09:52:21.921 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:22 localhost nova_compute[282206]: 2026-02-23 09:52:22.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:22 localhost sshd[307839]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:52:24 localhost nova_compute[282206]: 2026-02-23 09:52:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 
0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.076 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.077 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:52:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:52:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2038369328' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.511 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.578 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.579 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.777 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.779 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11734MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.779 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.780 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.854 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.854 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.855 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:52:25 localhost nova_compute[282206]: 2026-02-23 09:52:25.900 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:52:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:52:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1701877501' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:52:26 localhost nova_compute[282206]: 2026-02-23 09:52:26.460 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:52:26 localhost nova_compute[282206]: 2026-02-23 09:52:26.466 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:52:26 localhost nova_compute[282206]: 2026-02-23 09:52:26.484 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:52:26 localhost nova_compute[282206]: 2026-02-23 09:52:26.487 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:52:26 localhost nova_compute[282206]: 2026-02-23 09:52:26.487 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:52:26 localhost nova_compute[282206]: 2026-02-23 09:52:26.922 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:30 localhost nova_compute[282206]: 2026-02-23 09:52:30.300 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:30.301 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:52:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:30.302 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:52:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:52:30 localhost podman[307886]: 2026-02-23 09:52:30.916986528 +0000 UTC m=+0.087625047 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:52:30 localhost podman[307886]: 2026-02-23 09:52:30.954500879 +0000 UTC m=+0.125139438 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:52:30 localhost podman[307885]: 2026-02-23 09:52:30.967323542 +0000 UTC m=+0.138520728 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:52:30 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:52:31 localhost podman[307885]: 2026-02-23 09:52:31.081235834 +0000 UTC m=+0.252432950 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:52:31 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:52:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:31 localhost nova_compute[282206]: 2026-02-23 09:52:31.925 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:32.305 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:52:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:32.760 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpu3w1xssi/privsep.sock']#033[00m Feb 23 04:52:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.383 265541 INFO 
oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:52:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.264 307935 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:52:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.269 307935 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:52:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.273 307935 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Feb 23 04:52:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.273 307935 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307935#033[00m Feb 23 04:52:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:52:33 localhost podman[307940]: 2026-02-23 09:52:33.909122412 +0000 UTC m=+0.082194551 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:52:33 localhost podman[307940]: 2026-02-23 09:52:33.922207563 +0000 UTC m=+0.095279702 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:52:33 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:52:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:33.968 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpmznh5rez/privsep.sock']#033[00m Feb 23 04:52:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.648 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:52:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.508 307963 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:52:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.514 307963 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:52:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.517 307963 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Feb 23 04:52:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:34.518 307963 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307963#033[00m Feb 23 04:52:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e86 do_prune osdmap full prune enabled Feb 23 04:52:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e87 e87: 6 total, 6 up, 6 in Feb 23 04:52:35 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e87: 6 total, 6 up, 6 in Feb 23 04:52:35 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:35.560 
265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpu8soh03c/privsep.sock']#033[00m Feb 23 04:52:36 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.159 265541 INFO oslo.privsep.daemon [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Feb 23 04:52:36 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.058 307975 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Feb 23 04:52:36 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.062 307975 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Feb 23 04:52:36 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.066 307975 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Feb 23 04:52:36 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:36.066 307975 INFO oslo.privsep.daemon [-] privsep daemon running as pid 307975#033[00m Feb 23 04:52:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:36 localhost nova_compute[282206]: 2026-02-23 09:52:36.929 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e87 do_prune osdmap full prune enabled Feb 23 04:52:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 e88: 6 total, 6 up, 6 in Feb 23 04:52:37 localhost ceph-mon[294160]: log_channel(cluster) 
log [DBG] : osdmap e88: 6 total, 6 up, 6 in Feb 23 04:52:37 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e45: np0005626465.hlpkwo(active, since 92s), standbys: np0005626463.wtksup, np0005626466.nisqfq Feb 23 04:52:37 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:37.593 265541 INFO neutron.agent.linux.ip_lib [None req-4e1c89ef-b6d9-48ca-81aa-4080e1a293c6 - - - - - -] Device tapa6e249e9-2a cannot be used as it has no MAC address#033[00m Feb 23 04:52:37 localhost nova_compute[282206]: 2026-02-23 09:52:37.708 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:37 localhost kernel: device tapa6e249e9-2a entered promiscuous mode Feb 23 04:52:37 localhost ovn_controller[157695]: 2026-02-23T09:52:37Z|00074|binding|INFO|Claiming lport a6e249e9-2ac1-4cf8-814c-e468492579ad for this chassis. Feb 23 04:52:37 localhost ovn_controller[157695]: 2026-02-23T09:52:37Z|00075|binding|INFO|a6e249e9-2ac1-4cf8-814c-e468492579ad: Claiming unknown Feb 23 04:52:37 localhost NetworkManager[5974]: [1771840357.7236] manager: (tapa6e249e9-2a): new Generic device (/org/freedesktop/NetworkManager/Devices/17) Feb 23 04:52:37 localhost nova_compute[282206]: 2026-02-23 09:52:37.723 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:37 localhost systemd-udevd[307991]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:52:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:52:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:37.736 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-f52ac7ca-e197-490d-a7bf-412806b20437', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f52ac7ca-e197-490d-a7bf-412806b20437', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00c2d7924384b97b57547b4797141dc', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d438fe50-32ab-4d61-865c-d5c18259e35d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a6e249e9-2ac1-4cf8-814c-e468492579ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:52:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:37.737 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a6e249e9-2ac1-4cf8-814c-e468492579ad in datapath f52ac7ca-e197-490d-a7bf-412806b20437 bound to our chassis#033[00m Feb 23 04:52:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:37.740 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 38c2bb38-6d42-43ab-8465-bf9a5286871e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 23 04:52:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:37.740 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f52ac7ca-e197-490d-a7bf-412806b20437, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:52:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:37.743 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[9cc62bd7-f464-4f55-83b9-34aeadc04de3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:52:37 localhost journal[231253]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, ) Feb 23 04:52:37 localhost ovn_controller[157695]: 2026-02-23T09:52:37Z|00076|binding|INFO|Setting lport a6e249e9-2ac1-4cf8-814c-e468492579ad ovn-installed in OVS Feb 23 04:52:37 localhost journal[231253]: hostname: np0005626463.localdomain Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost ovn_controller[157695]: 2026-02-23T09:52:37Z|00077|binding|INFO|Setting lport a6e249e9-2ac1-4cf8-814c-e468492579ad up in Southbound Feb 23 04:52:37 localhost nova_compute[282206]: 2026-02-23 09:52:37.753 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such 
device Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost journal[231253]: ethtool ioctl error on tapa6e249e9-2a: No such device Feb 23 04:52:37 localhost nova_compute[282206]: 2026-02-23 09:52:37.802 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:37 localhost nova_compute[282206]: 2026-02-23 09:52:37.827 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:37 localhost systemd[1]: tmp-crun.VOH4Jg.mount: Deactivated successfully. Feb 23 04:52:37 localhost podman[307993]: 2026-02-23 09:52:37.875765623 +0000 UTC m=+0.129144291 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 04:52:37 localhost podman[307993]: 2026-02-23 09:52:37.880691244 +0000 UTC m=+0.134069892 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:52:37 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:52:38 localhost podman[308083]: Feb 23 04:52:38 localhost podman[308083]: 2026-02-23 09:52:38.758913939 +0000 UTC m=+0.098382828 container create f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f52ac7ca-e197-490d-a7bf-412806b20437, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:52:38 localhost podman[308083]: 2026-02-23 09:52:38.707285626 +0000 UTC m=+0.046754535 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:52:38 localhost systemd[1]: Started libpod-conmon-f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11.scope. Feb 23 04:52:38 localhost systemd[1]: Started libcrun container. 
Feb 23 04:52:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0584738fbc7368073bf3c71d73f757000942af3fc730583c035cdde3d707c677/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:52:38 localhost podman[308083]: 2026-02-23 09:52:38.847589437 +0000 UTC m=+0.187058316 container init f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f52ac7ca-e197-490d-a7bf-412806b20437, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:52:38 localhost podman[308083]: 2026-02-23 09:52:38.85615665 +0000 UTC m=+0.195625529 container start f4fb933ef414244081c84e33388df31570fc83c922a1140ac94ef0bca735ba11 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f52ac7ca-e197-490d-a7bf-412806b20437, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 23 04:52:38 localhost dnsmasq[308101]: started, version 2.85 cachesize 150 Feb 23 04:52:38 localhost dnsmasq[308101]: DNS service limited to local subnets Feb 23 04:52:38 localhost dnsmasq[308101]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:52:38 localhost dnsmasq[308101]: warning: no upstream servers 
configured Feb 23 04:52:38 localhost dnsmasq-dhcp[308101]: DHCP, static leases only on 192.168.199.0, lease time 1d Feb 23 04:52:38 localhost dnsmasq[308101]: read /var/lib/neutron/dhcp/f52ac7ca-e197-490d-a7bf-412806b20437/addn_hosts - 0 addresses Feb 23 04:52:38 localhost dnsmasq-dhcp[308101]: read /var/lib/neutron/dhcp/f52ac7ca-e197-490d-a7bf-412806b20437/host Feb 23 04:52:38 localhost dnsmasq-dhcp[308101]: read /var/lib/neutron/dhcp/f52ac7ca-e197-490d-a7bf-412806b20437/opts Feb 23 04:52:39 localhost podman[242954]: time="2026-02-23T09:52:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:52:39 localhost podman[242954]: @ - - [23/Feb/2026:09:52:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 04:52:39 localhost podman[242954]: @ - - [23/Feb/2026:09:52:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1" Feb 23 04:52:39 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:52:39.981 265541 INFO neutron.agent.dhcp.agent [None req-970079b0-9620-418a-9f8a-2f59947e4e0c - - - - - -] DHCP configuration for ports {'4d46b2db-eaf9-4e89-bf4e-88bee8d2ccef'} is completed#033[00m Feb 23 04:52:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:41 localhost nova_compute[282206]: 2026-02-23 09:52:41.931 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:41 localhost nova_compute[282206]: 2026-02-23 09:52:41.936 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:43 localhost openstack_network_exporter[245358]: ERROR 
09:52:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:52:43 localhost openstack_network_exporter[245358]: Feb 23 04:52:43 localhost openstack_network_exporter[245358]: ERROR 09:52:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:52:43 localhost openstack_network_exporter[245358]: Feb 23 04:52:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:46 localhost nova_compute[282206]: 2026-02-23 09:52:46.933 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:46 localhost nova_compute[282206]: 2026-02-23 09:52:46.937 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:52:47 localhost podman[308103]: 2026-02-23 09:52:47.921527408 +0000 UTC m=+0.088866915 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:52:47 localhost podman[308103]: 2026-02-23 09:52:47.964562398 +0000 UTC m=+0.131901875 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:52:47 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:52:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:48.554 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:52:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:52:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:52:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:52:50 localhost podman[308127]: 2026-02-23 09:52:50.920579414 +0000 UTC m=+0.088749002 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.buildah.version=1.33.7, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, release=1770267347, 
url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal) Feb 23 04:52:50 localhost podman[308127]: 2026-02-23 09:52:50.965364217 +0000 UTC m=+0.133533785 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Feb 23 04:52:50 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:52:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:51 localhost nova_compute[282206]: 2026-02-23 09:52:51.936 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:51 localhost nova_compute[282206]: 2026-02-23 09:52:51.940 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.136 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 
'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.149 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a403f99f-cf8c-43ce-91f8-4976707ea483', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.137615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db23bfe-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '6c48a94920d0213268430a5cc86cbc02cb0fb30f420f90f88f8ff10d2c1e8497'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.137615', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db24edc-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': 'ab61ed7628a9bbe58b4201933bb5cce59891bef55808c9c09ffdcd4155338ccc'}]}, 'timestamp': '2026-02-23 09:52:56.151093', '_unique_id': 'c0c110997904434c8c24adf7576e625d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:52:56.152 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:52:56.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.152 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.154 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.154 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e584afda-a9e7-4d91-afb6-8fd389c5d023', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.154149', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db2da3c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '978ff5a49e58209fba251079a9a93872994fdca4d7b9b72556068484ed874b93'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.154149', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db2ea9a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '823ff808c19f2c5d7876cba172d18e6052b268bce3e6c3e76efb64d2b48058b8'}]}, 'timestamp': '2026-02-23 09:52:56.155073', '_unique_id': 'fb21675608c143739754fa1b19fd4a90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.156 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.185 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.186 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6621c95f-3213-4f48-b4f9-bc5de10fa92b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.157306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db7b796-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '841ac17f2e952ee0c776cbc5f4fca4f8b742307ab3a45caa419a2fc9850d3c82'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': 
'37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.157306', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db7c998-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '77983e0af840e6b53e233aac95af7cb465a42e4759328cb942948795466fe07b'}]}, 'timestamp': '2026-02-23 09:52:56.187020', '_unique_id': 'd58893ee9fa746be8389dc0fefba112c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.188 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '386f5ba1-ed5d-48ff-94a9-37f9a653ea18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.189684', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6db8c366-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'c6e76c09462620436629b8e16153a3547f98604c3471bd296d7253a032ffc0cd'}]}, 'timestamp': '2026-02-23 09:52:56.193352', '_unique_id': '9e72e5f3bb1a4dd68b8dbe46ecd1397b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.194 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.196 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d502be5-94f9-4da0-94bc-a977af167870', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.195577', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6db92d06-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '2e4d1cba71708a7d3f8d8fcdd2e27ab4ce62f9bee3bc5d96fa2150967c5b986c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.195577', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6db942fa-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '42d8fab9ba00abc825586744d7b692aceebc44f0ba23935fb5de0711b4bb182a'}]}, 'timestamp': '2026-02-23 09:52:56.196589', '_unique_id': '85cf677e87d640b7b344ab345ad7548e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.197 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd99a8241-67db-48a3-8177-ea9b24b84302', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.198830', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6db9ad62-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'c0995ccbf590bfb08538ffdc6443a0c5cc4ee5e01be8bcab81e915c76c3ea8b9'}]}, 'timestamp': '2026-02-23 09:52:56.199334', '_unique_id': 'cb73d5b15b3b488f8e1c51bdee72d250'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.200 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e18b8400-ca80-454e-8a4e-785b1e9a3b48', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.201764', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dba1fe0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '4546ee1f5d35d0115e92a3b0572b8878211c4071daaffb6505c870f0c2bc7a1f'}]}, 'timestamp': '2026-02-23 09:52:56.202264', '_unique_id': 'f91f54c49b49492a9cb657651d89aef7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b96b9796-14b3-4ad5-8600-298cdb5868da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.204373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dba83fe-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '76bd46dc750382bf2e87974ee1bc60b1a24a8de96219a290b7a23ab2db583875'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.204373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dba972c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '9b06c6d9b51409139bb9a015e53c5eec2372085d0c942f7debcc2538c26150c0'}]}, 'timestamp': '2026-02-23 09:52:56.205291', '_unique_id': 'd3c5402438564833b157a5b553ef1b8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.206 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '5acfa468-eee8-4f15-824b-dba0e794d0f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.207513', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dbaff0a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '7d3accf3f0a7e28b289d4052378e384c3abd10c003771e77171354757fa7a68e'}]}, 'timestamp': '2026-02-23 09:52:56.208013', '_unique_id': '894671bdf0e34bd592aea32f98ae5d27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dc1271d-da02-42de-8439-8f52324b0d10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.210143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dbb6576-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '14ca091eb53f86ae5c1411dff702a54cc1a7f0780b4240a1a4993d5274312dc0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.210143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dbb75f2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '1eaa722fd9381772b6afbfb5b6338be14f238ba0543205d0e3ff1286f416f3dc'}]}, 'timestamp': '2026-02-23 09:52:56.211029', '_unique_id': 'cd304300f34c4b18918d532d418d6751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.211 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd85fe9d-449b-480a-b853-915e5f1380f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.213198', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dbbdcf4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'b1def77732d128eb3dc21dde6449edfda377876fe628b6406325e7aa1fb6f8a7'}]}, 'timestamp': '2026-02-23 09:52:56.213655', '_unique_id': '509774196d13445cb4d06d9616704c56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.214 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.215 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.215 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.216 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0df52c24-b4a8-4e5d-a2e5-bfc2679fa5ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.215740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dbc416c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '2147ae0344d6b13e64ac5fc95040f4e9439125e9a0561f4660b8a31a39646540'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.215740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dbc542c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '431874d2453e2d6a570547692b73242f03d8db7cfb42313e58c6b0d404a1de0b'}]}, 'timestamp': '2026-02-23 09:52:56.216716', '_unique_id': '18ea33ae83894cd89294216ba1a3ae7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.217 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14474806-b793-4bf5-b24b-2cbd43f16ace', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.218823', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dbcb9da-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '9f24032eb2ed2d3baad3631733c6dc2b8c28b5d44554c4ddf34056268cb28c1d'}]}, 'timestamp': '2026-02-23 09:52:56.219309', '_unique_id': '64d5cfda42574d84bffad74186b70a60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.220 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.221 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a4f49d3-e09c-4136-93a1-a4a78d032e81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:52:56.221393', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6dbfc72e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.428154928, 'message_signature': 'a8262a8b95bf422dbb974a22fa63b2f5016202586facccf483c38a3577476998'}]}, 'timestamp': '2026-02-23 09:52:56.239307', '_unique_id': 'd13a3807f03147fbb26b5e2d28af895c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ead0b49-2547-46db-a196-25eef3141b34', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.241470', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc02d22-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': 'c60d285185e7b1a6a5c74c1cbbde595997742586e99559e3536000156b6804ee'}]}, 'timestamp': '2026-02-23 09:52:56.241952', '_unique_id': '4e68a67fe8614e8eb4f5a8c9b116b509'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da5cc68-f293-4cc9-b5bc-2161baf76c4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.244196', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc097e4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '8e1c4baeca59843ce48eedba6abe47b85861581f9730ac7defa690d6e78634cb'}]}, 'timestamp': '2026-02-23 09:52:56.244659', '_unique_id': 'ad884156fec545b7a4fc4a1f4c5bba9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23
04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dacf9d90-83b4-4075-8727-b8d51a730c38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.246736', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc0fbf8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 'message_signature': '75c9d44e7c1500f54d3a2abf49c5b4f6532108328716542a2c2b708578b97a0c'}]}, 'timestamp': '2026-02-23 09:52:56.247214', '_unique_id': '74d7bd41780d48bb89d99b9f1f7e0c7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 12580000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '793bb3f7-8238-4dd8-8623-a3a628fe8345', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12580000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:52:56.249558', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6dc16908-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.428154928, 'message_signature': 'f0768357bb1628766dfa4fe85467acc72904e86a5c4502f83d6f8007a9c7bf6a'}]}, 'timestamp': '2026-02-23 09:52:56.250026', '_unique_id': '55a0d6a3f8f147c5b14ff51656a3d97b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:52:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28fb9d72-697c-45c5-9e71-d2d73b9de224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:52:56.252119', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6dc1cd12-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.379190088, 
'message_signature': 'b8bc347771d4511549327bb73ad0e9f3f0d33903280a702b9cf37224fb7b819e'}]}, 'timestamp': '2026-02-23 09:52:56.252571', '_unique_id': '00c0abcd65514d51a3cc4d5269642e38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e2b644b6-55d9-4c1c-a679-e7b8cd8c923a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.254821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dc238c4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': 'ebd2a0487b074eabd1af949976d0a6659ae72de830e4cd9c7b85f87891cf3dfd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.254821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dc2492c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.32709778, 'message_signature': '6868fbebe194e89cf1fc9e3d4445bcd321ba23f21b7e802bc25e8a63ffa53baf'}]}, 'timestamp': '2026-02-23 09:52:56.255722', '_unique_id': '52c830864f4e41cdad9e7c3c7a12a2f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:52:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:52:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:52:56.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c35896ca-df79-4f07-b2d0-3a9fdcdc8d03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:52:56.257862', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6dc2aeda-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '5d68c482bae114b9820a8f95d5a1d2c847f33ce90a3dc89bf87cce5433e3fcc8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:52:56.257862', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6dc2bf24-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11816.346789914, 'message_signature': '489d8e93b6dbf0c69486950b13ba647544ce93b6611302468d9bd2a0b42f6d62'}]}, 'timestamp': '2026-02-23 09:52:56.258737', '_unique_id': 'db876bfeae4e4e4e9877c87039a49a1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.259 12 ERROR oslo_messaging.notify.messaging Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:52:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:52:56.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:52:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:52:56 
localhost nova_compute[282206]: 2026-02-23 09:52:56.938 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:52:56 localhost nova_compute[282206]: 2026-02-23 09:52:56.941 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:53:01 localhost podman[308149]: 2026-02-23 09:53:01.906916666 +0000 UTC m=+0.081670696 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:53:01 localhost nova_compute[282206]: 2026-02-23 09:53:01.940 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:01 localhost podman[308150]: 2026-02-23 09:53:01.966158842 +0000 UTC m=+0.138012243 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:53:01 localhost podman[308149]: 2026-02-23 09:53:01.969813474 +0000 UTC m=+0.144567484 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260216, managed_by=edpm_ansible, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:53:01 localhost podman[308150]: 2026-02-23 09:53:01.973522577 +0000 UTC m=+0.145375968 container exec_died 
bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:53:01 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:53:02 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:53:04 localhost sshd[308197]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:53:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:53:04 localhost podman[308199]: 2026-02-23 09:53:04.910571623 +0000 UTC m=+0.086631497 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
config_id=ceilometer_agent_compute) Feb 23 04:53:04 localhost podman[308199]: 2026-02-23 09:53:04.924311434 +0000 UTC m=+0.100371328 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2) Feb 23 04:53:04 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:53:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:06 localhost nova_compute[282206]: 2026-02-23 09:53:06.943 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:07 localhost ovn_controller[157695]: 2026-02-23T09:53:07Z|00078|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory Feb 23 04:53:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:08.536 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:53:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:08.537 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:53:08 localhost nova_compute[282206]: 2026-02-23 09:53:08.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:53:08 localhost podman[308219]: 2026-02-23 09:53:08.903094557 +0000 UTC m=+0.078190118 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:53:08 localhost podman[308219]: 2026-02-23 09:53:08.936550033 +0000 UTC m=+0.111645594 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
org.label-schema.build-date=20260216) Feb 23 04:53:08 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:53:09 localhost podman[242954]: time="2026-02-23T09:53:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:53:09 localhost podman[242954]: @ - - [23/Feb/2026:09:53:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 04:53:09 localhost podman[242954]: @ - - [23/Feb/2026:09:53:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18798 "" "Go-http-client/1.1" Feb 23 04:53:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. 
Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.318457) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391318495, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 1188, "num_deletes": 256, "total_data_size": 1290539, "memory_usage": 1312240, "flush_reason": "Manual Compaction"} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391327057, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 1267672, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22845, "largest_seqno": 24032, "table_properties": {"data_size": 1262533, "index_size": 2610, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11142, "raw_average_key_size": 19, "raw_value_size": 1251999, "raw_average_value_size": 2188, "num_data_blocks": 116, "num_entries": 572, "num_filter_entries": 572, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840296, "oldest_key_time": 1771840296, "file_creation_time": 1771840391, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8647 microseconds, and 3883 cpu microseconds. Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.327104) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 1267672 bytes OK Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.327128) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331161) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331179) EVENT_LOG_v1 {"time_micros": 1771840391331173, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331199) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 1285123, prev total WAL file 
size 1285447, number of live WAL files 2. Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331916) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373638' seq:72057594037927935, type:22 .. '6C6F676D0034303230' seq:0, type:0; will stop at (end) Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(1237KB)], [39(16MB)] Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391331979, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 18618438, "oldest_snapshot_seqno": -1} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12090 keys, 18507129 bytes, temperature: kUnknown Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391439850, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 18507129, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18438120, "index_size": 37676, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30277, "raw_key_size": 325082, "raw_average_key_size": 26, "raw_value_size": 18231880, 
"raw_average_value_size": 1508, "num_data_blocks": 1437, "num_entries": 12090, "num_filter_entries": 12090, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840391, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.440227) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 18507129 bytes Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.441846) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.4 rd, 171.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 16.5 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(29.3) write-amplify(14.6) OK, records in: 12626, records dropped: 536 output_compression: NoCompression Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.441897) EVENT_LOG_v1 {"time_micros": 1771840391441863, "job": 22, "event": "compaction_finished", "compaction_time_micros": 107996, "compaction_time_cpu_micros": 51197, "output_level": 6, "num_output_files": 1, "total_output_size": 18507129, "num_input_records": 12626, "num_output_records": 12090, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391442210, "job": 22, "event": "table_file_deletion", "file_number": 41} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840391444741, 
"job": 22, "event": "table_file_deletion", "file_number": 39} Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.331780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444860) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444895) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:53:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:53:11.444898) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:53:11 localhost sshd[308237]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:53:11 localhost nova_compute[282206]: 2026-02-23 09:53:11.945 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:13 localhost openstack_network_exporter[245358]: ERROR 09:53:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:53:13 localhost openstack_network_exporter[245358]: Feb 23 04:53:13 localhost openstack_network_exporter[245358]: ERROR 09:53:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:53:13 localhost openstack_network_exporter[245358]: Feb 23 04:53:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 04:53:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 04:53:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 04:53:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 04:53:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 04:53:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 04:53:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:14 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:14.540 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, 
record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:53:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:53:15 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:53:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:16 localhost nova_compute[282206]: 2026-02-23 09:53:16.947 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:53:16 localhost nova_compute[282206]: 2026-02-23 09:53:16.948 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:16 localhost 
nova_compute[282206]: 2026-02-23 09:53:16.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:53:16 localhost nova_compute[282206]: 2026-02-23 09:53:16.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:53:16 localhost nova_compute[282206]: 2026-02-23 09:53:16.950 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:53:16 localhost nova_compute[282206]: 2026-02-23 09:53:16.953 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.489 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.489 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.490 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.654 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] 
Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.655 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.655 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:53:18 localhost nova_compute[282206]: 2026-02-23 09:53:18.656 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:53:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:53:18 localhost systemd[1]: tmp-crun.Hhwyue.mount: Deactivated successfully. 
Feb 23 04:53:18 localhost podman[308381]: 2026-02-23 09:53:18.923071214 +0000 UTC m=+0.094691184 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:53:18 localhost podman[308381]: 2026-02-23 09:53:18.935262187 +0000 UTC m=+0.106882167 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:53:18 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:53:19 localhost nova_compute[282206]: 2026-02-23 09:53:19.093 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:53:19 localhost nova_compute[282206]: 2026-02-23 09:53:19.145 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock 
"refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:53:19 localhost nova_compute[282206]: 2026-02-23 09:53:19.146 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:53:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:53:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:21 localhost nova_compute[282206]: 2026-02-23 09:53:21.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:21 localhost nova_compute[282206]: 2026-02-23 09:53:21.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:21 localhost nova_compute[282206]: 2026-02-23 09:53:21.074 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:21 localhost nova_compute[282206]: 2026-02-23 09:53:21.075 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes 
run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:21 localhost nova_compute[282206]: 2026-02-23 09:53:21.075 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:53:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:53:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:53:21 localhost podman[308404]: 2026-02-23 09:53:21.947919671 +0000 UTC m=+0.076862448 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git) Feb 23 04:53:21 localhost nova_compute[282206]: 
2026-02-23 09:53:21.955 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:21 localhost podman[308404]: 2026-02-23 09:53:21.96357699 +0000 UTC m=+0.092519737 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, release=1770267347, version=9.7, vcs-type=git, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64) Feb 23 04:53:21 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:53:22 localhost nova_compute[282206]: 2026-02-23 09:53:22.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:23 localhost nova_compute[282206]: 2026-02-23 09:53:23.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:24 localhost nova_compute[282206]: 2026-02-23 09:53:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m 
Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.076 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.076 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:53:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:53:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2615064459' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.526 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.587 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.587 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.781 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.782 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11453MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.782 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.782 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.858 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.859 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.859 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.904 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:53:26 localhost nova_compute[282206]: 2026-02-23 09:53:26.956 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:53:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:53:27 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2212968234' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:53:27 localhost nova_compute[282206]: 2026-02-23 09:53:27.333 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:53:27 localhost nova_compute[282206]: 2026-02-23 09:53:27.341 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:53:27 localhost nova_compute[282206]: 2026-02-23 09:53:27.588 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:53:27 localhost nova_compute[282206]: 2026-02-23 09:53:27.591 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:53:27 localhost nova_compute[282206]: 2026-02-23 09:53:27.591 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:53:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:32 localhost nova_compute[282206]: 2026-02-23 09:53:31.960 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:53:32 localhost nova_compute[282206]: 2026-02-23 09:53:31.962 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:53:32 localhost nova_compute[282206]: 2026-02-23 09:53:31.963 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:53:32 localhost nova_compute[282206]: 2026-02-23 09:53:31.963 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:53:32 localhost nova_compute[282206]: 2026-02-23 09:53:32.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:32 localhost nova_compute[282206]: 2026-02-23 09:53:32.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:53:32 localhost podman[308472]: 2026-02-23 09:53:32.922166071 +0000 UTC m=+0.089649930 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:53:32 localhost podman[308471]: 2026-02-23 09:53:32.963811268 +0000 UTC m=+0.133944698 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0) Feb 23 04:53:32 localhost podman[308472]: 2026-02-23 09:53:32.983948205 +0000 UTC m=+0.151432084 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', 
'--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:53:32 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:53:33 localhost podman[308471]: 2026-02-23 09:53:33.030322897 +0000 UTC m=+0.200456297 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true) Feb 23 04:53:33 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:53:35 localhost nova_compute[282206]: 2026-02-23 09:53:35.755 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:53:35 localhost systemd[1]: tmp-crun.3qLLqj.mount: Deactivated successfully. 
Feb 23 04:53:35 localhost podman[308520]: 2026-02-23 09:53:35.922845848 +0000 UTC m=+0.093491377 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:53:35 localhost podman[308520]: 2026-02-23 09:53:35.95812505 +0000 UTC m=+0.128770569 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, 
container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216) Feb 23 04:53:35 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:53:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:37 localhost nova_compute[282206]: 2026-02-23 09:53:37.311 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:37 localhost nova_compute[282206]: 2026-02-23 09:53:37.859 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:39 localhost podman[242954]: time="2026-02-23T09:53:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:53:39 localhost podman[242954]: @ - - [23/Feb/2026:09:53:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 04:53:39 localhost podman[242954]: @ - - [23/Feb/2026:09:53:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18798 "" "Go-http-client/1.1" Feb 23 04:53:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:53:39 localhost podman[308538]: 2026-02-23 09:53:39.905238473 +0000 UTC m=+0.082410368 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true) Feb 23 04:53:39 localhost 
podman[308538]: 2026-02-23 09:53:39.938353728 +0000 UTC m=+0.115525583 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Feb 23 04:53:39 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:53:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:42 localhost nova_compute[282206]: 2026-02-23 09:53:42.314 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:42 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:42.806 265541 INFO neutron.agent.linux.ip_lib [None req-3eef3e80-071c-4985-92db-d392e5623e8f - - - - - -] Device tapff39110f-d5 cannot be used as it has no MAC address#033[00m Feb 23 04:53:42 localhost nova_compute[282206]: 2026-02-23 09:53:42.862 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:42 localhost kernel: device tapff39110f-d5 entered promiscuous mode Feb 23 04:53:42 localhost ovn_controller[157695]: 2026-02-23T09:53:42Z|00079|binding|INFO|Claiming lport ff39110f-d5ab-4f4c-b656-11139ee6c196 for this chassis. Feb 23 04:53:42 localhost nova_compute[282206]: 2026-02-23 09:53:42.869 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:42 localhost ovn_controller[157695]: 2026-02-23T09:53:42Z|00080|binding|INFO|ff39110f-d5ab-4f4c-b656-11139ee6c196: Claiming unknown Feb 23 04:53:42 localhost NetworkManager[5974]: [1771840422.8709] manager: (tapff39110f-d5): new Generic device (/org/freedesktop/NetworkManager/Devices/18) Feb 23 04:53:42 localhost systemd-udevd[308566]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:53:42 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:42.890 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02917a1d904f4889b9e244e1ebfc57ca', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fe629c8-1dc0-4c84-9b5b-6b0d444ac4ee, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ff39110f-d5ab-4f4c-b656-11139ee6c196) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:53:42 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:42.893 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ff39110f-d5ab-4f4c-b656-11139ee6c196 in datapath b3238cd9-9eb9-4ae1-bb2b-833536c18deb bound to our chassis#033[00m Feb 23 04:53:42 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:42.897 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port a8a7da51-0c94-4438-a1bc-0b45c4adff92 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 23 04:53:42 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:42.898 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3238cd9-9eb9-4ae1-bb2b-833536c18deb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:42.899 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d789b269-be6e-4aad-b1e7-15938a37f976]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost ovn_controller[157695]: 2026-02-23T09:53:42Z|00081|binding|INFO|Setting lport ff39110f-d5ab-4f4c-b656-11139ee6c196 ovn-installed in OVS Feb 23 04:53:42 localhost ovn_controller[157695]: 2026-02-23T09:53:42Z|00082|binding|INFO|Setting lport ff39110f-d5ab-4f4c-b656-11139ee6c196 up in Southbound Feb 23 04:53:42 localhost nova_compute[282206]: 2026-02-23 09:53:42.908 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 23 04:53:42 localhost journal[231253]: ethtool ioctl error on tapff39110f-d5: No such device Feb 
23 04:53:42 localhost nova_compute[282206]: 2026-02-23 09:53:42.942 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:42 localhost nova_compute[282206]: 2026-02-23 09:53:42.970 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:43 localhost openstack_network_exporter[245358]: ERROR 09:53:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:53:43 localhost openstack_network_exporter[245358]: Feb 23 04:53:43 localhost openstack_network_exporter[245358]: ERROR 09:53:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:53:43 localhost openstack_network_exporter[245358]: Feb 23 04:53:43 localhost nova_compute[282206]: 2026-02-23 09:53:43.562 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:43 localhost podman[308638]: Feb 23 04:53:43 localhost podman[308638]: 2026-02-23 09:53:43.807470459 +0000 UTC m=+0.084792782 container create b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 23 04:53:43 localhost systemd[1]: Started libpod-conmon-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35.scope. Feb 23 04:53:43 localhost systemd[1]: Started libcrun container. 
Feb 23 04:53:43 localhost podman[308638]: 2026-02-23 09:53:43.767007257 +0000 UTC m=+0.044329651 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 04:53:43 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aecf83ec51d273f51dfbd2c3054b8e6328f27606ae999dc013ec2a626bc3eb75/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:53:43 localhost podman[308638]: 2026-02-23 09:53:43.880353343 +0000 UTC m=+0.157675656 container init b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 04:53:43 localhost podman[308638]: 2026-02-23 09:53:43.889450982 +0000 UTC m=+0.166773305 container start b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 23 04:53:43 localhost dnsmasq[308657]: started, version 2.85 cachesize 150
Feb 23 04:53:43 localhost dnsmasq[308657]: DNS service limited to local subnets
Feb 23 04:53:43 localhost dnsmasq[308657]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 04:53:43 localhost dnsmasq[308657]: warning: no upstream servers configured
Feb 23 04:53:43 localhost dnsmasq-dhcp[308657]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 04:53:43 localhost dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 0 addresses
Feb 23 04:53:43 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 04:53:43 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 04:53:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:44.026 265541 INFO neutron.agent.dhcp.agent [None req-dab2c9f8-4c4d-41a8-847d-d075e54f936e - - - - - -] DHCP configuration for ports {'2934a6f5-43b8-45c4-9f75-838564def4b3'} is completed
Feb 23 04:53:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e88 do_prune osdmap full prune enabled
Feb 23 04:53:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e89 e89: 6 total, 6 up, 6 in
Feb 23 04:53:44 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 23 04:53:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:44.305 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:53:43Z, description=, device_id=6ce1665d-3eb2-47a2-bfdb-37d82bbd1318, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=af44d09a-4063-4295-a15e-3aae2c1b49de, ip_allocation=immediate, mac_address=fa:16:3e:25:86:49, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:53:40Z, description=, dns_domain=, id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-800312373-network, port_security_enabled=True, project_id=02917a1d904f4889b9e244e1ebfc57ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4993, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=372, status=ACTIVE, subnets=['376bac44-58af-4a22-9c21-46b42c4d09e0'], tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:41Z, vlan_transparent=None, network_id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, port_security_enabled=False, project_id=02917a1d904f4889b9e244e1ebfc57ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=380, status=DOWN, tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:44Z on network b3238cd9-9eb9-4ae1-bb2b-833536c18deb
Feb 23 04:53:44 localhost dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 1 addresses
Feb 23 04:53:44 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 04:53:44 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 04:53:44 localhost podman[308675]: 2026-02-23 09:53:44.561053072 +0000 UTC m=+0.066197161 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 23 04:53:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:44.801 265541 INFO neutron.agent.dhcp.agent [None req-9eadebd3-0920-4ae1-8127-d9fe2ae8b8fc - - - - - -] DHCP configuration for ports {'af44d09a-4063-4295-a15e-3aae2c1b49de'} is completed
Feb 23 04:53:45 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:45.601 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:53:43Z, description=, device_id=6ce1665d-3eb2-47a2-bfdb-37d82bbd1318, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=af44d09a-4063-4295-a15e-3aae2c1b49de, ip_allocation=immediate, mac_address=fa:16:3e:25:86:49, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:53:40Z, description=, dns_domain=, id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveAutoBlockMigrationV225Test-800312373-network, port_security_enabled=True, project_id=02917a1d904f4889b9e244e1ebfc57ca, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4993, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=372, status=ACTIVE, subnets=['376bac44-58af-4a22-9c21-46b42c4d09e0'], tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:41Z, vlan_transparent=None, network_id=b3238cd9-9eb9-4ae1-bb2b-833536c18deb, port_security_enabled=False, project_id=02917a1d904f4889b9e244e1ebfc57ca, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=380, status=DOWN, tags=[], tenant_id=02917a1d904f4889b9e244e1ebfc57ca, updated_at=2026-02-23T09:53:44Z on network b3238cd9-9eb9-4ae1-bb2b-833536c18deb
Feb 23 04:53:45 localhost dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 1 addresses
Feb 23 04:53:45 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 04:53:45 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 04:53:45 localhost podman[308713]: 2026-02-23 09:53:45.828602403 +0000 UTC m=+0.064098256 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 04:53:45 localhost systemd[1]: tmp-crun.vv3bTf.mount: Deactivated successfully.
Feb 23 04:53:46 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:46.065 265541 INFO neutron.agent.dhcp.agent [None req-e8fd42cd-b4ad-4e40-a14c-7297c64ce0e8 - - - - - -] DHCP configuration for ports {'af44d09a-4063-4295-a15e-3aae2c1b49de'} is completed
Feb 23 04:53:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e89 do_prune osdmap full prune enabled
Feb 23 04:53:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e90 e90: 6 total, 6 up, 6 in
Feb 23 04:53:46 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in
Feb 23 04:53:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:53:47 localhost nova_compute[282206]: 2026-02-23 09:53:47.319 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:48 localhost neutron_sriov_agent[258207]: 2026-02-23 09:53:48.150 2 INFO neutron.agent.securitygroups_rpc [None req-a2d5984e-7e32-490c-a625-105e3b4d8b68 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']
Feb 23 04:53:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 04:53:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 04:53:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 04:53:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e90 do_prune osdmap full prune enabled
Feb 23 04:53:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e91 e91: 6 total, 6 up, 6 in
Feb 23 04:53:48 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e91: 6 total, 6 up, 6 in
Feb 23 04:53:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:53:49 localhost podman[308736]: 2026-02-23 09:53:49.904977948 +0000 UTC m=+0.082573642 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 23 04:53:49 localhost podman[308736]: 2026-02-23 09:53:49.917300127 +0000 UTC m=+0.094895811 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 23 04:53:49 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 04:53:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e91 do_prune osdmap full prune enabled
Feb 23 04:53:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e92 e92: 6 total, 6 up, 6 in
Feb 23 04:53:50 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in
Feb 23 04:53:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:50.772 265541 INFO neutron.agent.linux.ip_lib [None req-8a2a38ed-38fa-49f4-b9b5-b7bd9ebf4b88 - - - - - -] Device tap4cbf3d42-6e cannot be used as it has no MAC address
Feb 23 04:53:50 localhost nova_compute[282206]: 2026-02-23 09:53:50.833 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:50 localhost kernel: device tap4cbf3d42-6e entered promiscuous mode
Feb 23 04:53:50 localhost NetworkManager[5974]: [1771840430.8404] manager: (tap4cbf3d42-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 23 04:53:50 localhost nova_compute[282206]: 2026-02-23 09:53:50.840 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:50 localhost ovn_controller[157695]: 2026-02-23T09:53:50Z|00083|binding|INFO|Claiming lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 for this chassis.
Feb 23 04:53:50 localhost ovn_controller[157695]: 2026-02-23T09:53:50Z|00084|binding|INFO|4cbf3d42-6ec8-4d67-8923-c66e0247fcd0: Claiming unknown
Feb 23 04:53:50 localhost systemd-udevd[308770]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 04:53:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:50.849 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e1135ba2724a69b072bbda0ea8476c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57c5c75f-3246-4a64-87cf-649ab7e0f2d0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4cbf3d42-6ec8-4d67-8923-c66e0247fcd0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 04:53:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:50.851 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 in datapath c4367d4b-271d-4a28-a878-d77074456171 bound to our chassis
Feb 23 04:53:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:50.853 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4367d4b-271d-4a28-a878-d77074456171 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 04:53:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:53:50.853 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[8756a610-6bdf-41f3-aa62-a078d450e9d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost ovn_controller[157695]: 2026-02-23T09:53:50Z|00085|binding|INFO|Setting lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 ovn-installed in OVS
Feb 23 04:53:50 localhost ovn_controller[157695]: 2026-02-23T09:53:50Z|00086|binding|INFO|Setting lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 up in Southbound
Feb 23 04:53:50 localhost nova_compute[282206]: 2026-02-23 09:53:50.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost journal[231253]: ethtool ioctl error on tap4cbf3d42-6e: No such device
Feb 23 04:53:50 localhost nova_compute[282206]: 2026-02-23 09:53:50.908 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:50 localhost nova_compute[282206]: 2026-02-23 09:53:50.936 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:53:51 localhost nova_compute[282206]: 2026-02-23 09:53:51.594 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:51 localhost podman[308841]:
Feb 23 04:53:51 localhost podman[308841]: 2026-02-23 09:53:51.801183483 +0000 UTC m=+0.089448293 container create c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 04:53:51 localhost systemd[1]: Started libpod-conmon-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84.scope.
Feb 23 04:53:51 localhost podman[308841]: 2026-02-23 09:53:51.760235518 +0000 UTC m=+0.048500358 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 04:53:51 localhost systemd[1]: Started libcrun container.
Feb 23 04:53:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e813bd4a46c7971615ac88ffd7ed7c4cab04c48395cbaf4f846b5efffe82a9e8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:53:51 localhost podman[308841]: 2026-02-23 09:53:51.881078763 +0000 UTC m=+0.169343563 container init c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 04:53:51 localhost podman[308841]: 2026-02-23 09:53:51.889601294 +0000 UTC m=+0.177866094 container start c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 04:53:51 localhost dnsmasq[308859]: started, version 2.85 cachesize 150
Feb 23 04:53:51 localhost dnsmasq[308859]: DNS service limited to local subnets
Feb 23 04:53:51 localhost dnsmasq[308859]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 04:53:51 localhost dnsmasq[308859]: warning: no upstream servers configured
Feb 23 04:53:51 localhost dnsmasq-dhcp[308859]: DHCP, static leases only on 19.80.0.0, lease time 1d
Feb 23 04:53:51 localhost dnsmasq[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/addn_hosts - 0 addresses
Feb 23 04:53:51 localhost dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/host
Feb 23 04:53:51 localhost dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/opts
Feb 23 04:53:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:52.073 265541 INFO neutron.agent.dhcp.agent [None req-590e9331-7138-4f5c-a26c-74331c2db684 - - - - - -] DHCP configuration for ports {'c580c9b8-a35b-42fb-bda8-24401f2a22e1'} is completed
Feb 23 04:53:52 localhost nova_compute[282206]: 2026-02-23 09:53:52.361 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:53:52 localhost neutron_sriov_agent[258207]: 2026-02-23 09:53:52.569 2 INFO neutron.agent.securitygroups_rpc [None req-92b63054-7ec2-4ccb-bee7-2b93ea819111 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']
Feb 23 04:53:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:52.619 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:53:52Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1fc7da92-c93a-4191-b374-5aef0705e0ce, ip_allocation=immediate, mac_address=fa:16:3e:77:6d:80, name=tempest-subport-1525156145, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:53:48Z, description=, dns_domain=, id=c4367d4b-271d-4a28-a878-d77074456171, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-631042675, port_security_enabled=True, project_id=b5e1135ba2724a69b072bbda0ea8476c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63828, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=419, status=ACTIVE, subnets=['7dfbfbc0-d83e-4729-b852-8f17e8a182f9'], tags=[], tenant_id=b5e1135ba2724a69b072bbda0ea8476c, updated_at=2026-02-23T09:53:49Z, vlan_transparent=None, network_id=c4367d4b-271d-4a28-a878-d77074456171, port_security_enabled=True, project_id=b5e1135ba2724a69b072bbda0ea8476c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5e2da0ff-f592-42de-9188-06e3b0bca61b'], standard_attr_id=454, status=DOWN, tags=[], tenant_id=b5e1135ba2724a69b072bbda0ea8476c, updated_at=2026-02-23T09:53:52Z on network c4367d4b-271d-4a28-a878-d77074456171
Feb 23 04:53:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e92 do_prune osdmap full prune enabled
Feb 23 04:53:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e93 e93: 6 total, 6 up, 6 in
Feb 23 04:53:52 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in
Feb 23 04:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:53:52 localhost systemd[1]: tmp-crun.2CaR5E.mount: Deactivated successfully.
Feb 23 04:53:52 localhost podman[308877]: 2026-02-23 09:53:52.8559166 +0000 UTC m=+0.071465263 container kill c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 04:53:52 localhost dnsmasq[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/addn_hosts - 1 addresses
Feb 23 04:53:52 localhost dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/host
Feb 23 04:53:52 localhost dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/opts
Feb 23 04:53:52 localhost podman[308888]: 2026-02-23 09:53:52.941800282 +0000 UTC m=+0.110643863 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.expose-services=, vcs-type=git)
Feb 23 04:53:52 localhost podman[308888]: 2026-02-23 09:53:52.965219721 +0000 UTC m=+0.134063332 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Feb 23 04:53:52 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:53:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:53:53.148 265541 INFO neutron.agent.dhcp.agent [None req-e19dc4ce-495b-42fc-a50f-199680036686 - - - - - -] DHCP configuration for ports {'1fc7da92-c93a-4191-b374-5aef0705e0ce'} is completed
Feb 23 04:53:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e93 do_prune osdmap full prune enabled
Feb 23 04:53:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e94 e94: 6 total, 6 up, 6 in
Feb 23 04:53:53 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in
Feb 23 04:53:53 localhost systemd[1]: tmp-crun.cZQya0.mount: Deactivated successfully.
Feb 23 04:53:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:53:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e94 do_prune osdmap full prune enabled Feb 23 04:53:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e95 e95: 6 total, 6 up, 6 in Feb 23 04:53:56 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in Feb 23 04:53:56 localhost nova_compute[282206]: 2026-02-23 09:53:56.607 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:57 localhost nova_compute[282206]: 2026-02-23 09:53:57.094 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:57 localhost nova_compute[282206]: 2026-02-23 09:53:57.365 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:57 localhost nova_compute[282206]: 2026-02-23 09:53:57.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:53:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:53:58.479 2 INFO neutron.agent.securitygroups_rpc [req-eb0b5adf-b7bd-4216-9e46-c2d60917a5c7 req-02734a2f-bd2e-4435-92b8-64f041df35e7 c511c0d31bd1497ea63920bacbc29b16 bba12cc9382b485789a88c5fc615cc96 - - default default] Security group rule updated ['9bb94a7a-3596-41a3-a016-4ce3c9b7d984']#033[00m Feb 23 04:53:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:53:58.754 2 INFO neutron.agent.securitygroups_rpc [req-a40b7300-b557-4abd-b7da-db42c2c5294c req-3cf01c79-11ae-4df5-84a2-12d00520d629 c511c0d31bd1497ea63920bacbc29b16 bba12cc9382b485789a88c5fc615cc96 - - 
default default] Security group rule updated ['9bb94a7a-3596-41a3-a016-4ce3c9b7d984']#033[00m Feb 23 04:53:59 localhost sshd[308919]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:54:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e95 do_prune osdmap full prune enabled Feb 23 04:54:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e96 e96: 6 total, 6 up, 6 in Feb 23 04:54:01 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in Feb 23 04:54:01 localhost nova_compute[282206]: 2026-02-23 09:54:01.799 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:02 localhost nova_compute[282206]: 2026-02-23 09:54:02.367 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:02 localhost nova_compute[282206]: 2026-02-23 09:54:02.748 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:54:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:54:03 localhost podman[308921]: 2026-02-23 09:54:03.9289202 +0000 UTC m=+0.093753335 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216) Feb 23 04:54:04 localhost podman[308921]: 2026-02-23 09:54:04.015897656 +0000 UTC m=+0.180730751 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS) Feb 23 04:54:04 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:54:04 localhost podman[308922]: 2026-02-23 09:54:04.017180346 +0000 UTC m=+0.180525036 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:54:04 localhost podman[308922]: 2026-02-23 09:54:04.09784369 +0000 UTC m=+0.261188370 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:54:04 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:54:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:54:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e96 do_prune osdmap full prune enabled Feb 23 04:54:06 localhost podman[308970]: 2026-02-23 09:54:06.926502562 +0000 UTC m=+0.088849104 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:54:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 e97: 6 total, 6 up, 6 in Feb 23 04:54:06 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in Feb 23 04:54:06 localhost nova_compute[282206]: 2026-02-23 09:54:06.951 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:06 localhost podman[308970]: 2026-02-23 09:54:06.969589103 +0000 UTC m=+0.131935695 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:54:06 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:54:07 localhost nova_compute[282206]: 2026-02-23 09:54:07.371 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:08 localhost ovn_controller[157695]: 2026-02-23T09:54:08Z|00087|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:08 localhost nova_compute[282206]: 2026-02-23 09:54:08.239 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:08 localhost ovn_controller[157695]: 2026-02-23T09:54:08Z|00088|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:08 localhost nova_compute[282206]: 2026-02-23 09:54:08.370 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:08 localhost nova_compute[282206]: 2026-02-23 09:54:08.602 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:08 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:54:08.603 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:54:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:08.604 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:54:09 localhost podman[242954]: time="2026-02-23T09:54:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:54:09 localhost podman[242954]: @ - - [23/Feb/2026:09:54:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160727 "" "Go-http-client/1.1" Feb 23 04:54:09 localhost podman[242954]: @ - - [23/Feb/2026:09:54:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19762 "" "Go-http-client/1.1" Feb 23 04:54:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:54:10 localhost podman[308989]: 2026-02-23 09:54:10.915438237 +0000 UTC m=+0.088640908 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 04:54:10 localhost 
podman[308989]: 2026-02-23 09:54:10.925119614 +0000 UTC m=+0.098322305 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:54:10 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:54:11 localhost ovn_controller[157695]: 2026-02-23T09:54:11Z|00089|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:11 localhost nova_compute[282206]: 2026-02-23 09:54:11.162 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:12 localhost nova_compute[282206]: 2026-02-23 09:54:12.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:12 localhost nova_compute[282206]: 2026-02-23 09:54:12.414 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:13 localhost openstack_network_exporter[245358]: ERROR 09:54:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:54:13 localhost openstack_network_exporter[245358]: Feb 23 04:54:13 localhost openstack_network_exporter[245358]: ERROR 09:54:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:54:13 localhost openstack_network_exporter[245358]: Feb 23 04:54:14 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:14.607 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:54:15 
localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:15.190 265541 INFO neutron.agent.linux.ip_lib [None req-16e3220b-2424-4022-a903-b5a2cd0d9f75 - - - - - -] Device tapc54c90dc-59 cannot be used as it has no MAC address#033[00m Feb 23 04:54:15 localhost nova_compute[282206]: 2026-02-23 09:54:15.215 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:15 localhost kernel: device tapc54c90dc-59 entered promiscuous mode Feb 23 04:54:15 localhost NetworkManager[5974]: [1771840455.2241] manager: (tapc54c90dc-59): new Generic device (/org/freedesktop/NetworkManager/Devices/20) Feb 23 04:54:15 localhost nova_compute[282206]: 2026-02-23 09:54:15.223 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:15 localhost ovn_controller[157695]: 2026-02-23T09:54:15Z|00090|binding|INFO|Claiming lport c54c90dc-59eb-4ba6-a441-5146f8224a2f for this chassis. Feb 23 04:54:15 localhost ovn_controller[157695]: 2026-02-23T09:54:15Z|00091|binding|INFO|c54c90dc-59eb-4ba6-a441-5146f8224a2f: Claiming unknown Feb 23 04:54:15 localhost systemd-udevd[309017]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:54:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:15.237 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5e1c9f6e8d451fa766e1133da7c78c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3abb40-d939-4e85-874a-e1c4e0aed9c9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c54c90dc-59eb-4ba6-a441-5146f8224a2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:54:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:15.240 163572 INFO neutron.agent.ovn.metadata.agent [-] Port c54c90dc-59eb-4ba6-a441-5146f8224a2f in datapath 10e2e9cc-29ed-4970-84df-64c996e76871 bound to our chassis#033[00m Feb 23 04:54:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:15.242 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 10e2e9cc-29ed-4970-84df-64c996e76871 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:54:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:15.243 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[85147674-c299-47ad-b38c-f2c2e39e26fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost ovn_controller[157695]: 2026-02-23T09:54:15Z|00092|binding|INFO|Setting lport c54c90dc-59eb-4ba6-a441-5146f8224a2f ovn-installed in OVS Feb 23 04:54:15 localhost ovn_controller[157695]: 2026-02-23T09:54:15Z|00093|binding|INFO|Setting lport c54c90dc-59eb-4ba6-a441-5146f8224a2f up in Southbound Feb 23 04:54:15 localhost nova_compute[282206]: 2026-02-23 09:54:15.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost journal[231253]: ethtool ioctl error on tapc54c90dc-59: No such device Feb 23 04:54:15 localhost nova_compute[282206]: 2026-02-23 09:54:15.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:15 localhost nova_compute[282206]: 2026-02-23 09:54:15.330 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:16 localhost podman[309153]: Feb 23 04:54:16 localhost podman[309153]: 2026-02-23 09:54:16.323243002 +0000 UTC m=+0.081427147 container create bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:54:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:54:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:54:16 localhost systemd[1]: Started libpod-conmon-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c.scope. Feb 23 04:54:16 localhost podman[309153]: 2026-02-23 09:54:16.281593625 +0000 UTC m=+0.039777840 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:54:16 localhost systemd[1]: tmp-crun.ZPQzZI.mount: Deactivated successfully. Feb 23 04:54:16 localhost systemd[1]: Started libcrun container. 
Feb 23 04:54:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/edab4fa058864b90b5cf7f18b174b68d3de95b3e3a1dd659b3050d106daf7c15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:54:16 localhost podman[309153]: 2026-02-23 09:54:16.434830033 +0000 UTC m=+0.193014188 container init bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:54:16 localhost podman[309153]: 2026-02-23 09:54:16.443704816 +0000 UTC m=+0.201888951 container start bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:54:16 localhost dnsmasq[309191]: started, version 2.85 cachesize 150 Feb 23 04:54:16 localhost dnsmasq[309191]: DNS service limited to local subnets Feb 23 04:54:16 localhost dnsmasq[309191]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:54:16 localhost dnsmasq[309191]: warning: no upstream servers 
configured Feb 23 04:54:16 localhost dnsmasq-dhcp[309191]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:54:16 localhost dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 0 addresses Feb 23 04:54:16 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host Feb 23 04:54:16 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts Feb 23 04:54:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:16.678 265541 INFO neutron.agent.dhcp.agent [None req-4c359eed-cf97-4fcd-b878-458d218022d9 - - - - - -] DHCP configuration for ports {'ebac5c61-7d67-4897-8f42-37105858c5d4'} is completed#033[00m Feb 23 04:54:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e97 do_prune osdmap full prune enabled Feb 23 04:54:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e98 e98: 6 total, 6 up, 6 in Feb 23 04:54:16 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in Feb 23 04:54:17 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:54:17 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:54:17 localhost nova_compute[282206]: 2026-02-23 09:54:17.417 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:17 localhost nova_compute[282206]: 2026-02-23 09:54:17.900 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:19 localhost 
nova_compute[282206]: 2026-02-23 09:54:19.480 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:19 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:19.547 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:19Z, description=, device_id=5cb19bc2-a554-4277-a22e-8a81f247e688, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f934b998-7206-4f75-a3ae-c758bf173f59, ip_allocation=immediate, mac_address=fa:16:3e:18:8c:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:13Z, description=, dns_domain=, id=10e2e9cc-29ed-4970-84df-64c996e76871, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupsTestJSON-984997043-network, port_security_enabled=True, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['da529488-ae4a-474a-81ed-8a85a5e66a50'], tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:14Z, vlan_transparent=None, network_id=10e2e9cc-29ed-4970-84df-64c996e76871, port_security_enabled=False, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=596, status=DOWN, tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:19Z on network 10e2e9cc-29ed-4970-84df-64c996e76871#033[00m Feb 23 04:54:19 
localhost systemd[1]: tmp-crun.m9n4gp.mount: Deactivated successfully. Feb 23 04:54:19 localhost dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 1 addresses Feb 23 04:54:19 localhost podman[309210]: 2026-02-23 09:54:19.787894712 +0000 UTC m=+0.068931674 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:54:19 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host Feb 23 04:54:19 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts Feb 23 04:54:20 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:20.153 265541 INFO neutron.agent.dhcp.agent [None req-522110c1-716e-49e2-a0ca-2b71dcbf71a3 - - - - - -] DHCP configuration for ports {'f934b998-7206-4f75-a3ae-c758bf173f59'} is completed#033[00m Feb 23 04:54:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:54:20 localhost sshd[309232]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:54:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:54:20 localhost nova_compute[282206]: 2026-02-23 09:54:20.593 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:20 localhost nova_compute[282206]: 2026-02-23 09:54:20.594 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:54:20 localhost nova_compute[282206]: 2026-02-23 09:54:20.594 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:54:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:54:20 localhost podman[309233]: 2026-02-23 09:54:20.919543306 +0000 UTC m=+0.083722107 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:54:20 localhost podman[309233]: 2026-02-23 09:54:20.927996596 +0000 UTC m=+0.092175357 
container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:54:20 localhost nova_compute[282206]: 2026-02-23 09:54:20.935 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:54:20 localhost nova_compute[282206]: 2026-02-23 09:54:20.936 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:54:20 localhost nova_compute[282206]: 2026-02-23 09:54:20.936 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:54:20 localhost 
nova_compute[282206]: 2026-02-23 09:54:20.937 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:54:20 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:54:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:54:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e98 do_prune osdmap full prune enabled Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.445 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e99 e99: 6 total, 6 up, 6 in Feb 23 04:54:22 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.525 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, 
"meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.550 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.550 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.550 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.551 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.551 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.552 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 23 04:54:22 localhost nova_compute[282206]: 2026-02-23 09:54:22.566 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:22 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:22.784 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:19Z, description=, device_id=5cb19bc2-a554-4277-a22e-8a81f247e688, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f934b998-7206-4f75-a3ae-c758bf173f59, ip_allocation=immediate, mac_address=fa:16:3e:18:8c:b2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:13Z, description=, dns_domain=, id=10e2e9cc-29ed-4970-84df-64c996e76871, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-SecurityGroupsTestJSON-984997043-network, port_security_enabled=True, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=62679, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=566, status=ACTIVE, subnets=['da529488-ae4a-474a-81ed-8a85a5e66a50'], tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:14Z, vlan_transparent=None, network_id=10e2e9cc-29ed-4970-84df-64c996e76871, port_security_enabled=False, project_id=6c5e1c9f6e8d451fa766e1133da7c78c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=596, status=DOWN, tags=[], tenant_id=6c5e1c9f6e8d451fa766e1133da7c78c, updated_at=2026-02-23T09:54:19Z on network 10e2e9cc-29ed-4970-84df-64c996e76871#033[00m Feb 23 04:54:23 localhost podman[309271]: 2026-02-23 09:54:23.024214693 +0000 UTC m=+0.060981641 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0) Feb 23 04:54:23 localhost dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 1 addresses Feb 23 04:54:23 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host Feb 23 04:54:23 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts Feb 23 04:54:23 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:54:23 localhost nova_compute[282206]: 2026-02-23 09:54:23.080 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:23 localhost nova_compute[282206]: 2026-02-23 09:54:23.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:23 localhost nova_compute[282206]: 2026-02-23 09:54:23.081 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:23 localhost podman[309286]: 2026-02-23 09:54:23.108306821 +0000 UTC m=+0.058804784 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347) Feb 23 04:54:23 localhost podman[309286]: 2026-02-23 09:54:23.120649389 +0000 UTC m=+0.071147392 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:54:23 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:54:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:23.272 265541 INFO neutron.agent.dhcp.agent [None req-12ed2ead-5d81-43c4-a1b1-5026044fa5a1 - - - - - -] DHCP configuration for ports {'f934b998-7206-4f75-a3ae-c758bf173f59'} is completed#033[00m Feb 23 04:54:24 localhost nova_compute[282206]: 2026-02-23 09:54:24.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e99 do_prune osdmap full prune enabled Feb 23 04:54:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e100 e100: 6 total, 6 up, 6 in Feb 23 04:54:24 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0. 
Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.834914) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464834960, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1229, "num_deletes": 254, "total_data_size": 1170312, "memory_usage": 1192328, "flush_reason": "Manual Compaction"} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464844836, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1135721, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24033, "largest_seqno": 25261, "table_properties": {"data_size": 1130174, "index_size": 2954, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 13048, "raw_average_key_size": 21, "raw_value_size": 1118642, "raw_average_value_size": 1827, "num_data_blocks": 125, "num_entries": 612, "num_filter_entries": 612, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840391, "oldest_key_time": 1771840391, "file_creation_time": 1771840464, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 10005 microseconds, and 4168 cpu microseconds. Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.844916) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1135721 bytes OK Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.844984) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.847755) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.847778) EVENT_LOG_v1 {"time_micros": 1771840464847772, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.847799) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1164659, prev total WAL file 
size 1164659, number of live WAL files 2. Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.848608) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end) Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1109KB)], [42(17MB)] Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464848653, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 19642850, "oldest_snapshot_seqno": -1} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12169 keys, 17877121 bytes, temperature: kUnknown Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464979818, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 17877121, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17808406, "index_size": 37190, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 327407, "raw_average_key_size": 26, "raw_value_size": 
17601662, "raw_average_value_size": 1446, "num_data_blocks": 1414, "num_entries": 12169, "num_filter_entries": 12169, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840464, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.980173) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 17877121 bytes Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.982510) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.6 rd, 136.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 17.6 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(33.0) write-amplify(15.7) OK, records in: 12702, records dropped: 533 output_compression: NoCompression Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.982550) EVENT_LOG_v1 {"time_micros": 1771840464982532, "job": 24, "event": "compaction_finished", "compaction_time_micros": 131319, "compaction_time_cpu_micros": 46147, "output_level": 6, "num_output_files": 1, "total_output_size": 17877121, "num_input_records": 12702, "num_output_records": 12169, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464982856, "job": 24, "event": "table_file_deletion", "file_number": 44} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840464985692, 
"job": 24, "event": "table_file_deletion", "file_number": 42} Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.848520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985797) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985808) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:54:24 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:54:24.985814) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:54:25 localhost nova_compute[282206]: 2026-02-23 09:54:25.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.143 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.155 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.159 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.160 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:54:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 
handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:54:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2552967718' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.754 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.594s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.823 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.824 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:54:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.969 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.970 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11357MB free_disk=41.63432312011719GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.970 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:54:26 localhost nova_compute[282206]: 2026-02-23 09:54:26.970 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.190 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.190 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.191 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.360 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.530 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 
04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.533 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.535 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.607 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 04:54:27 localhost nova_compute[282206]: 2026-02-23 09:54:27.647 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 04:54:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e100 do_prune osdmap full prune enabled Feb 23 04:54:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e101 e101: 6 total, 6 up, 6 in Feb 23 04:54:28 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in Feb 23 04:54:28 localhost nova_compute[282206]: 2026-02-23 09:54:28.914 282211 DEBUG oslo_concurrency.processutils [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:54:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:54:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2426814409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.293 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.378s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.299 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.341 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m 
Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.380 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.380 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.381 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.382 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 23 04:54:29 localhost nova_compute[282206]: 2026-02-23 09:54:29.417 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 23 04:54:30 localhost nova_compute[282206]: 2026-02-23 09:54:30.329 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:54:31 
localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:54:31 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2410368303' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:54:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e101 do_prune osdmap full prune enabled Feb 23 04:54:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e102 e102: 6 total, 6 up, 6 in Feb 23 04:54:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in Feb 23 04:54:32 localhost nova_compute[282206]: 2026-02-23 09:54:32.532 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:32 localhost nova_compute[282206]: 2026-02-23 09:54:32.541 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:54:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:54:34 localhost ovn_controller[157695]: 2026-02-23T09:54:34Z|00094|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:34 localhost nova_compute[282206]: 2026-02-23 09:54:34.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:34 localhost systemd[1]: tmp-crun.SMcMgf.mount: Deactivated successfully. Feb 23 04:54:34 localhost podman[309355]: 2026-02-23 09:54:34.933666465 +0000 UTC m=+0.102172414 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216) Feb 23 04:54:34 localhost systemd[1]: tmp-crun.EIDCKP.mount: Deactivated successfully. Feb 23 04:54:35 localhost podman[309356]: 2026-02-23 09:54:35.004433955 +0000 UTC m=+0.168407194 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:54:35 localhost podman[309355]: 2026-02-23 09:54:35.005461007 +0000 UTC m=+0.173966956 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:54:35 localhost podman[309356]: 2026-02-23 09:54:35.041382217 +0000 UTC m=+0.205355446 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', 
'--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:54:35 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:54:35 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:54:35 localhost nova_compute[282206]: 2026-02-23 09:54:35.341 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:36 localhost podman[309421]: 2026-02-23 09:54:36.8125633 +0000 UTC m=+0.060835177 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:54:36 localhost dnsmasq[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/addn_hosts - 0 addresses Feb 23 04:54:36 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/host Feb 23 04:54:36 localhost dnsmasq-dhcp[309191]: read /var/lib/neutron/dhcp/10e2e9cc-29ed-4970-84df-64c996e76871/opts Feb 23 04:54:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:37 localhost nova_compute[282206]: 2026-02-23 09:54:37.321 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:37 localhost kernel: device tapc54c90dc-59 left promiscuous mode Feb 23 04:54:37 localhost ovn_controller[157695]: 2026-02-23T09:54:37Z|00095|binding|INFO|Releasing lport c54c90dc-59eb-4ba6-a441-5146f8224a2f from this chassis (sb_readonly=0) Feb 23 04:54:37 localhost ovn_controller[157695]: 2026-02-23T09:54:37Z|00096|binding|INFO|Setting lport 
c54c90dc-59eb-4ba6-a441-5146f8224a2f down in Southbound Feb 23 04:54:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:37.331 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-10e2e9cc-29ed-4970-84df-64c996e76871', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6c5e1c9f6e8d451fa766e1133da7c78c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fc3abb40-d939-4e85-874a-e1c4e0aed9c9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c54c90dc-59eb-4ba6-a441-5146f8224a2f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:54:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:37.333 163572 INFO neutron.agent.ovn.metadata.agent [-] Port c54c90dc-59eb-4ba6-a441-5146f8224a2f in datapath 10e2e9cc-29ed-4970-84df-64c996e76871 unbound from our chassis#033[00m Feb 23 04:54:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:37.337 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
10e2e9cc-29ed-4970-84df-64c996e76871, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:54:37 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:37.338 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3aee6415-6e71-482c-9029-ef0377888538]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:54:37 localhost nova_compute[282206]: 2026-02-23 09:54:37.344 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:37 localhost nova_compute[282206]: 2026-02-23 09:54:37.537 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:37 localhost nova_compute[282206]: 2026-02-23 09:54:37.542 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:54:37 localhost systemd[1]: tmp-crun.1UZ3VI.mount: Deactivated successfully. 
Feb 23 04:54:37 localhost podman[309445]: 2026-02-23 09:54:37.915252006 +0000 UTC m=+0.086904045 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute) Feb 23 04:54:37 localhost podman[309445]: 2026-02-23 09:54:37.926081128 +0000 UTC m=+0.097733187 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260216) Feb 23 04:54:37 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:54:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e102 do_prune osdmap full prune enabled Feb 23 04:54:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e103 e103: 6 total, 6 up, 6 in Feb 23 04:54:39 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in Feb 23 04:54:39 localhost podman[242954]: time="2026-02-23T09:54:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:54:39 localhost podman[242954]: @ - - [23/Feb/2026:09:54:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162551 "" "Go-http-client/1.1" Feb 23 04:54:39 localhost podman[242954]: @ - - [23/Feb/2026:09:54:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20246 "" "Go-http-client/1.1" Feb 23 04:54:41 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:41.077 2 INFO neutron.agent.securitygroups_rpc [None req-ff2971e4-9240-4c6a-b550-c220da174a71 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']#033[00m Feb 23 04:54:41 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:41.367 2 INFO neutron.agent.securitygroups_rpc [None req-ff2971e4-9240-4c6a-b550-c220da174a71 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']#033[00m Feb 23 04:54:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:54:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:41 localhost podman[309465]: 2026-02-23 09:54:41.905653868 +0000 UTC m=+0.081898523 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent) Feb 23 04:54:41 localhost podman[309465]: 2026-02-23 09:54:41.916286373 +0000 UTC m=+0.092531068 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 23 04:54:41 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:54:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:42.131 2 INFO neutron.agent.securitygroups_rpc [None req-2742a8e1-f348-43b4-9fb2-dd2072837cd2 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']#033[00m Feb 23 04:54:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:42.229 2 INFO neutron.agent.securitygroups_rpc [None req-e6e1d23b-7f3f-4c9f-825c-f305c0d37186 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']#033[00m Feb 23 04:54:42 localhost ovn_controller[157695]: 2026-02-23T09:54:42Z|00097|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:42 localhost podman[309501]: 2026-02-23 09:54:42.4791633 +0000 UTC m=+0.061856057 container kill c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:54:42 localhost dnsmasq[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/addn_hosts - 0 addresses Feb 23 04:54:42 localhost dnsmasq-dhcp[308859]: read 
/var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/host Feb 23 04:54:42 localhost dnsmasq-dhcp[308859]: read /var/lib/neutron/dhcp/c4367d4b-271d-4a28-a878-d77074456171/opts Feb 23 04:54:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:42.483 2 INFO neutron.agent.securitygroups_rpc [None req-ca0adc89-f8c7-402b-bd8c-8b5b257e86b1 fb712af440b2428b8717631185f9fc4e 2fbe870428324feda18014285ef9eb40 - - default default] Security group member updated ['0982a37e-b389-40e3-834f-dcc14e42d01c']#033[00m Feb 23 04:54:42 localhost nova_compute[282206]: 2026-02-23 09:54:42.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:42 localhost nova_compute[282206]: 2026-02-23 09:54:42.538 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:42 localhost nova_compute[282206]: 2026-02-23 09:54:42.543 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:42 localhost nova_compute[282206]: 2026-02-23 09:54:42.960 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:43 localhost dnsmasq[308859]: exiting on receipt of SIGTERM Feb 23 04:54:43 localhost systemd[1]: tmp-crun.rU82g4.mount: Deactivated successfully. 
Feb 23 04:54:43 localhost podman[309538]: 2026-02-23 09:54:43.020200298 +0000 UTC m=+0.068877433 container kill c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:54:43 localhost systemd[1]: libpod-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84.scope: Deactivated successfully. Feb 23 04:54:43 localhost podman[309552]: 2026-02-23 09:54:43.090129451 +0000 UTC m=+0.058084441 container died c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:54:43 localhost podman[309552]: 2026-02-23 09:54:43.12725574 +0000 UTC m=+0.095210690 container cleanup c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2) Feb 23 04:54:43 localhost systemd[1]: libpod-conmon-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84.scope: Deactivated successfully. Feb 23 04:54:43 localhost podman[309556]: 2026-02-23 09:54:43.176596462 +0000 UTC m=+0.134045881 container remove c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4367d4b-271d-4a28-a878-d77074456171, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:54:43 localhost ovn_controller[157695]: 2026-02-23T09:54:43Z|00098|binding|INFO|Releasing lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 from this chassis (sb_readonly=0) Feb 23 04:54:43 localhost nova_compute[282206]: 2026-02-23 09:54:43.189 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:43 localhost ovn_controller[157695]: 2026-02-23T09:54:43Z|00099|binding|INFO|Setting lport 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 down in Southbound Feb 23 04:54:43 localhost kernel: device tap4cbf3d42-6e left promiscuous mode Feb 23 04:54:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:43.197 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 20a12874-8371-4400-bfd5-f2688e2d3266 with type ""#033[00m Feb 23 04:54:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:43.199 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', 
conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4367d4b-271d-4a28-a878-d77074456171', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5e1135ba2724a69b072bbda0ea8476c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=57c5c75f-3246-4a64-87cf-649ab7e0f2d0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=4cbf3d42-6ec8-4d67-8923-c66e0247fcd0) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:54:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:43.201 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 4cbf3d42-6ec8-4d67-8923-c66e0247fcd0 in datapath c4367d4b-271d-4a28-a878-d77074456171 unbound from our chassis#033[00m Feb 23 04:54:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:43.204 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c4367d4b-271d-4a28-a878-d77074456171, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:54:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:43.205 163675 DEBUG oslo.privsep.daemon [-] privsep: 
reply[affde97d-20ec-4fb0-be08-322cd094ef68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:54:43 localhost nova_compute[282206]: 2026-02-23 09:54:43.212 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.238 265541 INFO neutron.agent.dhcp.agent [None req-14252414-a13b-4f2d-867e-ef5a0d9933fd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:54:43 localhost dnsmasq[309191]: exiting on receipt of SIGTERM Feb 23 04:54:43 localhost podman[309599]: 2026-02-23 09:54:43.311596992 +0000 UTC m=+0.058833155 container kill bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:54:43 localhost systemd[1]: libpod-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c.scope: Deactivated successfully. 
Feb 23 04:54:43 localhost podman[309614]: 2026-02-23 09:54:43.372051225 +0000 UTC m=+0.043855436 container died bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2) Feb 23 04:54:43 localhost openstack_network_exporter[245358]: ERROR 09:54:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:54:43 localhost openstack_network_exporter[245358]: Feb 23 04:54:43 localhost openstack_network_exporter[245358]: ERROR 09:54:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:54:43 localhost openstack_network_exporter[245358]: Feb 23 04:54:43 localhost podman[309614]: 2026-02-23 09:54:43.421639685 +0000 UTC m=+0.093443856 container remove bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-10e2e9cc-29ed-4970-84df-64c996e76871, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:54:43 localhost systemd[1]: libpod-conmon-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c.scope: Deactivated successfully. 
Feb 23 04:54:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.488 265541 INFO neutron.agent.dhcp.agent [None req-d7b77519-726d-47a8-927b-110421a7141c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:54:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.576 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:54:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:43.853 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:54:43 localhost ovn_controller[157695]: 2026-02-23T09:54:43Z|00100|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:43 localhost nova_compute[282206]: 2026-02-23 09:54:43.987 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:44 localhost systemd[1]: var-lib-containers-storage-overlay-edab4fa058864b90b5cf7f18b174b68d3de95b3e3a1dd659b3050d106daf7c15-merged.mount: Deactivated successfully. Feb 23 04:54:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bafc254051d45d6a8f48958438ff8ce08d49c897b3f13b9deeb4ecf88054bd7c-userdata-shm.mount: Deactivated successfully. Feb 23 04:54:44 localhost systemd[1]: run-netns-qdhcp\x2d10e2e9cc\x2d29ed\x2d4970\x2d84df\x2d64c996e76871.mount: Deactivated successfully. Feb 23 04:54:44 localhost systemd[1]: var-lib-containers-storage-overlay-e813bd4a46c7971615ac88ffd7ed7c4cab04c48395cbaf4f846b5efffe82a9e8-merged.mount: Deactivated successfully. Feb 23 04:54:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c10cdca575a450dc369efd991d451e65c3e680818367723941494a0848776f84-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:54:44 localhost systemd[1]: run-netns-qdhcp\x2dc4367d4b\x2d271d\x2d4a28\x2da878\x2dd77074456171.mount: Deactivated successfully. Feb 23 04:54:44 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:44.637 2 INFO neutron.agent.securitygroups_rpc [None req-c06d8773-2479-4f88-85a7-f04d29c76a1d 0b7edff084ac4cda88d2d8f5182da779 b5e1135ba2724a69b072bbda0ea8476c - - default default] Security group member updated ['5e2da0ff-f592-42de-9188-06e3b0bca61b']#033[00m Feb 23 04:54:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e103 do_prune osdmap full prune enabled Feb 23 04:54:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e104 e104: 6 total, 6 up, 6 in Feb 23 04:54:45 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in Feb 23 04:54:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e104 do_prune osdmap full prune enabled Feb 23 04:54:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e105 e105: 6 total, 6 up, 6 in Feb 23 04:54:46 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in Feb 23 04:54:46 localhost ovn_controller[157695]: 2026-02-23T09:54:46Z|00101|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:54:46 localhost nova_compute[282206]: 2026-02-23 09:54:46.388 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:54:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e105 do_prune osdmap full prune enabled Feb 23 04:54:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e106 e106: 6 total, 6 up, 6 in Feb 23 04:54:46 localhost ceph-mon[294160]: log_channel(cluster) log 
[DBG] : osdmap e106: 6 total, 6 up, 6 in Feb 23 04:54:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:47.550 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:54:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:47.551 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:54:47 localhost nova_compute[282206]: 2026-02-23 09:54:47.586 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:47 localhost sshd[309641]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:54:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:48.312 265541 INFO neutron.agent.linux.ip_lib [None req-fce4ed71-03d3-41c5-bb15-fbd2116eca99 - - - - - -] Device tap1d4d42e8-30 cannot be used as it has no MAC address#033[00m Feb 23 04:54:48 localhost nova_compute[282206]: 2026-02-23 09:54:48.339 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:54:48 localhost kernel: device tap1d4d42e8-30 entered promiscuous mode Feb 23 04:54:48 localhost ovn_controller[157695]: 2026-02-23T09:54:48Z|00102|binding|INFO|Claiming lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b for this chassis. 
Feb 23 04:54:48 localhost ovn_controller[157695]: 2026-02-23T09:54:48Z|00103|binding|INFO|1d4d42e8-30fc-45f0-86d9-320b12bff55b: Claiming unknown
Feb 23 04:54:48 localhost NetworkManager[5974]: [1771840488.3495] manager: (tap1d4d42e8-30): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Feb 23 04:54:48 localhost nova_compute[282206]: 2026-02-23 09:54:48.349 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:48 localhost systemd-udevd[309652]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.363 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fbe870428324feda18014285ef9eb40', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=042074d1-ff67-46dd-af64-fc750025a9c1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d4d42e8-30fc-45f0-86d9-320b12bff55b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.365 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1d4d42e8-30fc-45f0-86d9-320b12bff55b in datapath 02ee7c5a-2db8-434a-a435-61821ceb4b9b bound to our chassis#033[00m
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.367 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02ee7c5a-2db8-434a-a435-61821ceb4b9b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.368 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3cb28397-80ec-4031-ae24-75137e90f386]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 04:54:48 localhost ovn_controller[157695]: 2026-02-23T09:54:48Z|00104|binding|INFO|Setting lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b ovn-installed in OVS
Feb 23 04:54:48 localhost ovn_controller[157695]: 2026-02-23T09:54:48Z|00105|binding|INFO|Setting lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b up in Southbound
Feb 23 04:54:48 localhost nova_compute[282206]: 2026-02-23 09:54:48.373 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost nova_compute[282206]: 2026-02-23 09:54:48.391 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost journal[231253]: ethtool ioctl error on tap1d4d42e8-30: No such device
Feb 23 04:54:48 localhost nova_compute[282206]: 2026-02-23 09:54:48.438 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:48 localhost nova_compute[282206]: 2026-02-23 09:54:48.464 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.555 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 04:54:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:48.557 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 04:54:49 localhost ovn_controller[157695]: 2026-02-23T09:54:49Z|00106|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 04:54:49 localhost nova_compute[282206]: 2026-02-23 09:54:49.293 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:49 localhost podman[309723]: 
Feb 23 04:54:49 localhost podman[309723]: 2026-02-23 09:54:49.321670591 +0000 UTC m=+0.100597285 container create c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216)
Feb 23 04:54:49 localhost systemd[1]: Started libpod-conmon-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a.scope.
Feb 23 04:54:49 localhost podman[309723]: 2026-02-23 09:54:49.282243562 +0000 UTC m=+0.061170296 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 04:54:49 localhost systemd[1]: tmp-crun.yZv2oU.mount: Deactivated successfully.
Feb 23 04:54:49 localhost systemd[1]: Started libcrun container.
Feb 23 04:54:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/590d7f40eabf0d6dfb27cf9871c5b44da87730638b147c979467440410e9b196/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:54:49 localhost podman[309723]: 2026-02-23 09:54:49.431539059 +0000 UTC m=+0.210465753 container init c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 04:54:49 localhost podman[309723]: 2026-02-23 09:54:49.440742081 +0000 UTC m=+0.219668775 container start c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:54:49 localhost dnsmasq[309742]: started, version 2.85 cachesize 150
Feb 23 04:54:49 localhost dnsmasq[309742]: DNS service limited to local subnets
Feb 23 04:54:49 localhost dnsmasq[309742]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 04:54:49 localhost dnsmasq[309742]: warning: no upstream servers configured
Feb 23 04:54:49 localhost dnsmasq-dhcp[309742]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 04:54:49 localhost dnsmasq[309742]: read /var/lib/neutron/dhcp/02ee7c5a-2db8-434a-a435-61821ceb4b9b/addn_hosts - 0 addresses
Feb 23 04:54:49 localhost dnsmasq-dhcp[309742]: read /var/lib/neutron/dhcp/02ee7c5a-2db8-434a-a435-61821ceb4b9b/host
Feb 23 04:54:49 localhost dnsmasq-dhcp[309742]: read /var/lib/neutron/dhcp/02ee7c5a-2db8-434a-a435-61821ceb4b9b/opts
Feb 23 04:54:49 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:49.652 265541 INFO neutron.agent.dhcp.agent [None req-0d6a0230-d3ea-40b2-9bcd-7410244dd29e - - - - - -] DHCP configuration for ports {'db544d9d-8b97-42b8-8a54-79f7ab5f8b46'} is completed#033[00m
Feb 23 04:54:49 localhost dnsmasq[309742]: exiting on receipt of SIGTERM
Feb 23 04:54:49 localhost systemd[1]: libpod-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a.scope: Deactivated successfully.
Feb 23 04:54:49 localhost podman[309761]: 2026-02-23 09:54:49.827346434 +0000 UTC m=+0.071534034 container kill c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:54:49 localhost podman[309774]: 2026-02-23 09:54:49.905703126 +0000 UTC m=+0.067690596 container died c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:54:49 localhost podman[309774]: 2026-02-23 09:54:49.936947395 +0000 UTC m=+0.098934835 container cleanup c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:54:49 localhost systemd[1]: libpod-conmon-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a.scope: Deactivated successfully.
Feb 23 04:54:49 localhost podman[309781]: 2026-02-23 09:54:49.988100913 +0000 UTC m=+0.136069673 container remove c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02ee7c5a-2db8-434a-a435-61821ceb4b9b, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 04:54:50 localhost ovn_controller[157695]: 2026-02-23T09:54:50Z|00107|binding|INFO|Releasing lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b from this chassis (sb_readonly=0)
Feb 23 04:54:50 localhost ovn_controller[157695]: 2026-02-23T09:54:50Z|00108|binding|INFO|Setting lport 1d4d42e8-30fc-45f0-86d9-320b12bff55b down in Southbound
Feb 23 04:54:50 localhost kernel: device tap1d4d42e8-30 left promiscuous mode
Feb 23 04:54:50 localhost nova_compute[282206]: 2026-02-23 09:54:50.045 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:50.051 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02ee7c5a-2db8-434a-a435-61821ceb4b9b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2fbe870428324feda18014285ef9eb40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=042074d1-ff67-46dd-af64-fc750025a9c1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1d4d42e8-30fc-45f0-86d9-320b12bff55b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 04:54:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:50.053 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1d4d42e8-30fc-45f0-86d9-320b12bff55b in datapath 02ee7c5a-2db8-434a-a435-61821ceb4b9b unbound from our chassis#033[00m
Feb 23 04:54:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:50.055 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02ee7c5a-2db8-434a-a435-61821ceb4b9b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 23 04:54:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:50.056 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[641328a5-d474-45cf-acbd-5977ea6b07b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 04:54:50 localhost nova_compute[282206]: 2026-02-23 09:54:50.066 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:50.261 265541 INFO neutron.agent.dhcp.agent [None req-6c50aab4-86c9-4e9e-89f0-3f62c8618500 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 23 04:54:50 localhost systemd[1]: var-lib-containers-storage-overlay-590d7f40eabf0d6dfb27cf9871c5b44da87730638b147c979467440410e9b196-merged.mount: Deactivated successfully.
Feb 23 04:54:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4931b1e71b8b1290fe866fefb1a5885f236e07cf1e81f48f9e248434aa23b6a-userdata-shm.mount: Deactivated successfully.
Feb 23 04:54:50 localhost systemd[1]: run-netns-qdhcp\x2d02ee7c5a\x2d2db8\x2d434a\x2da435\x2d61821ceb4b9b.mount: Deactivated successfully.
Feb 23 04:54:50 localhost podman[309822]: 2026-02-23 09:54:50.889410866 +0000 UTC m=+0.061537297 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:54:50 localhost dnsmasq[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/addn_hosts - 0 addresses
Feb 23 04:54:50 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/host
Feb 23 04:54:50 localhost dnsmasq-dhcp[308657]: read /var/lib/neutron/dhcp/b3238cd9-9eb9-4ae1-bb2b-833536c18deb/opts
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00109|binding|INFO|Releasing lport ff39110f-d5ab-4f4c-b656-11139ee6c196 from this chassis (sb_readonly=0)
Feb 23 04:54:51 localhost kernel: device tapff39110f-d5 left promiscuous mode
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00110|binding|INFO|Setting lport ff39110f-d5ab-4f4c-b656-11139ee6c196 down in Southbound
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.070 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b3238cd9-9eb9-4ae1-bb2b-833536c18deb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02917a1d904f4889b9e244e1ebfc57ca', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3fe629c8-1dc0-4c84-9b5b-6b0d444ac4ee, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ff39110f-d5ab-4f4c-b656-11139ee6c196) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 04:54:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:51.071 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.072 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ff39110f-d5ab-4f4c-b656-11139ee6c196 in datapath b3238cd9-9eb9-4ae1-bb2b-833536c18deb unbound from our chassis#033[00m
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.075 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b3238cd9-9eb9-4ae1-bb2b-833536c18deb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.077 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[6f199ec1-b5f7-41fe-9db7-7392f052d521]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.083 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.085 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00111|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 04:54:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:51.388 265541 INFO neutron.agent.linux.ip_lib [None req-1658a744-993a-437c-9d55-5b8ee12ce82d - - - - - -] Device tap8f0c5ba8-2f cannot be used as it has no MAC address#033[00m
Feb 23 04:54:51 localhost podman[309847]: 2026-02-23 09:54:51.419173718 +0000 UTC m=+0.099312896 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.423 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.427 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost kernel: device tap8f0c5ba8-2f entered promiscuous mode
Feb 23 04:54:51 localhost NetworkManager[5974]: [1771840491.4347] manager: (tap8f0c5ba8-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/22)
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00112|binding|INFO|Claiming lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 for this chassis.
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00113|binding|INFO|8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431: Claiming unknown
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.436 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost systemd-udevd[309877]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 04:54:51 localhost podman[309847]: 2026-02-23 09:54:51.448802716 +0000 UTC m=+0.128941904 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.449 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0bd74f028104d8eab41537070541f77', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d55ec2a4-c84a-4c50-ac81-d4cad2b77be1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.451 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 in datapath c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f bound to our chassis#033[00m
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.453 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 23 04:54:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:51.454 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[80f7f2ad-79a9-4004-942c-4a37ecec1d85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 04:54:51 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00114|binding|INFO|Setting lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 ovn-installed in OVS
Feb 23 04:54:51 localhost ovn_controller[157695]: 2026-02-23T09:54:51Z|00115|binding|INFO|Setting lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 up in Southbound
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.471 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost journal[231253]: ethtool ioctl error on tap8f0c5ba8-2f: No such device
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.518 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost nova_compute[282206]: 2026-02-23 09:54:51.551 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:54:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e106 do_prune osdmap full prune enabled
Feb 23 04:54:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 e107: 6 total, 6 up, 6 in
Feb 23 04:54:51 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in
Feb 23 04:54:52 localhost podman[309949]: 
Feb 23 04:54:52 localhost podman[309949]: 2026-02-23 09:54:52.39583052 +0000 UTC m=+0.087282337 container create caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:54:52 localhost systemd[1]: Started libpod-conmon-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81.scope.
Feb 23 04:54:52 localhost systemd[1]: Started libcrun container.
Feb 23 04:54:52 localhost podman[309949]: 2026-02-23 09:54:52.353705779 +0000 UTC m=+0.045157636 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 04:54:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/166f830536de571ce740513568578247182f4fa7b08cb2a3bc0f31bbb6bd274f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:54:52 localhost podman[309949]: 2026-02-23 09:54:52.468806917 +0000 UTC m=+0.160258734 container init caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 04:54:52 localhost systemd[1]: tmp-crun.Hmzlmo.mount: Deactivated successfully.
Feb 23 04:54:52 localhost podman[309949]: 2026-02-23 09:54:52.480120614 +0000 UTC m=+0.171572431 container start caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 04:54:52 localhost dnsmasq[309968]: started, version 2.85 cachesize 150
Feb 23 04:54:52 localhost dnsmasq[309968]: DNS service limited to local subnets
Feb 23 04:54:52 localhost dnsmasq[309968]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 04:54:52 localhost dnsmasq[309968]: warning: no upstream servers configured
Feb 23 04:54:52 localhost dnsmasq-dhcp[309968]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 04:54:52 localhost dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 0 addresses
Feb 23 04:54:52 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 04:54:52 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 04:54:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:52.616 265541 INFO neutron.agent.dhcp.agent [None req-efc9d436-1b4f-4d05-bf40-cc111ee43471 - - - - - -] DHCP configuration for ports {'6d4c17c0-28d2-4957-9681-dd047b0ebba7'} is completed#033[00m
Feb 23 04:54:52 localhost nova_compute[282206]: 2026-02-23 09:54:52.623 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:53 localhost ovn_controller[157695]: 2026-02-23T09:54:53Z|00116|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 04:54:53 localhost nova_compute[282206]: 2026-02-23 09:54:53.594 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 04:54:53 localhost nova_compute[282206]: 2026-02-23 09:54:53.856 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:53 localhost systemd[1]: tmp-crun.I9QnIP.mount: Deactivated successfully.
Feb 23 04:54:53 localhost podman[309969]: 2026-02-23 09:54:53.925662563 +0000 UTC m=+0.094600101 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, release=1770267347, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 23 04:54:53 localhost podman[309969]: 2026-02-23 09:54:53.942264782 +0000 UTC m=+0.111202310 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, version=9.7, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 23 04:54:53 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 04:54:55 localhost dnsmasq[308657]: exiting on receipt of SIGTERM
Feb 23 04:54:55 localhost systemd[1]: libpod-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35.scope: Deactivated successfully.
Feb 23 04:54:55 localhost podman[310006]: 2026-02-23 09:54:55.206078908 +0000 UTC m=+0.072333658 container kill b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 23 04:54:55 localhost podman[310020]: 2026-02-23 09:54:55.289457514 +0000 UTC m=+0.069174391 container died b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 04:54:55 localhost systemd[1]: tmp-crun.ybdHWf.mount: Deactivated successfully.
Feb 23 04:54:55 localhost podman[310020]: 2026-02-23 09:54:55.321694063 +0000 UTC m=+0.101410900 container cleanup b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 04:54:55 localhost systemd[1]: libpod-conmon-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35.scope: Deactivated successfully.
Feb 23 04:54:55 localhost podman[310027]: 2026-02-23 09:54:55.368172417 +0000 UTC m=+0.135640509 container remove b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b3238cd9-9eb9-4ae1-bb2b-833536c18deb, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 04:54:55 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:55.411 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 23 04:54:55 localhost ovn_metadata_agent[163567]: 2026-02-23 09:54:55.554 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 23 04:54:55 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:55.618 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:54Z, description=, device_id=daed35e2-7c42-4b10-a372-e586a9bec4ff, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=126e04eb-5ef6-4c95-9995-1dc19782e90c, ip_allocation=immediate, mac_address=fa:16:3e:63:df:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:48Z, description=, dns_domain=, id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1249780383-network, port_security_enabled=True, project_id=a0bd74f028104d8eab41537070541f77, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3813, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=781, status=ACTIVE, subnets=['aadca0bd-2aca-4d46-8acc-f53c76863e44'], tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:50Z, vlan_transparent=None, network_id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, port_security_enabled=False, project_id=a0bd74f028104d8eab41537070541f77, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=830, status=DOWN, tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:55Z on network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f#033[00m
Feb 23 04:54:55 localhost dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 1 addresses
Feb 23 04:54:55 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 04:54:55 localhost podman[310065]: 2026-02-23 09:54:55.831667708 +0000 UTC m=+0.067609874 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 04:54:55 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 04:54:55 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:55.920 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 23 04:54:56 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:56.083 265541 INFO neutron.agent.dhcp.agent [None req-63645ab9-f215-4269-8159-fc0bce210a6d - - - - - -] DHCP configuration for ports {'126e04eb-5ef6-4c95-9995-1dc19782e90c'} is completed#033[00m
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.145 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.160 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.163 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '12496de6-bffa-490d-9daa-577b047a4908', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.147733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b53aa722-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': 'cda72ab4ba8ab08db2f762b09e5426289ec7ad89bab3dc00aed808fc0e15c793'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.147733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b53ad38c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': 'b6112b8d6c0c3ea860beda9105ceaea4ba816277876dfbd7c833c1a1082f2415'}]}, 'timestamp': '2026-02-23 09:54:56.163990', '_unique_id': '1c3b53c35afa49d8b7333883141c5658'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.167 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ae017d-6b70-4be5-bc11-44544e819e78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.170172', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b53c70d4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '5670fa7c8d92f9d157fed3f1841d4dcac63d832a7bc16e1ba38d2a9998a86a9f'}]}, 'timestamp': '2026-02-23 09:54:56.174613', '_unique_id': '79523850fab94a4e8fe40d3efa46a99d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.176 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.177 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd195aafe-e084-4f7e-bc5d-0d5961c5fec8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.177491', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b53cf81a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '81777c6ebc3fddea48fb59f495d51097f517cf1c47158326c1f8f1a172b5e124'}]}, 'timestamp': '2026-02-23 09:54:56.178047', '_unique_id': '91b08addf6fb4e629633d6af0122ad27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.178 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 04:54:56 localhost systemd[1]: var-lib-containers-storage-overlay-aecf83ec51d273f51dfbd2c3054b8e6328f27606ae999dc013ec2a626bc3eb75-merged.mount: Deactivated successfully.
Feb 23 04:54:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b45d2a41c1142d2be7f1aa4bc27db6097a18330a539974ea3f84437645115e35-userdata-shm.mount: Deactivated successfully.
Feb 23 04:54:56 localhost systemd[1]: run-netns-qdhcp\x2db3238cd9\x2d9eb9\x2d4ae1\x2dbb2b\x2d833536c18deb.mount: Deactivated successfully.
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.221 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16b753ca-6f35-461d-a827-1b6557019594', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.182100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b543a61a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '13277080cab96402787c272fa8fee4abe207fd8e06c5629140fc521baeaff3ad'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.182100', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b543c08c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '21511fc604bca0fa5777744f4a8e7b8e813e8b5e895a9797f3dba1e232828536'}]}, 'timestamp': '2026-02-23 09:54:56.222469', '_unique_id': 'e1b85ae5534b471a8ffc8abb11797f98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.224 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.226 12 INFO
ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77e135b3-bc9e-44c3-9be6-3921fff16088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.226563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54474b4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': '40e210079174950b247d807a1dc723c73c8553f0a7ac690671f09ef5fdb2598a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.226563', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b5449110-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': '0d1b04927a16acb530460c676f3b7565dc3b1eb4c3c9c4169318180c15f7fa21'}]}, 'timestamp': '2026-02-23 09:54:56.227803', '_unique_id': '9bd90f0796be4d4caacd47ac56846ae7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent 
call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Feb 
23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.229 12 ERROR oslo_messaging.notify.messaging Feb 23 
04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ac3eee6-ac8f-49fc-ab7b-92187ea44cff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.231115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b5452a80-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '14bfc4b730b4c8b6079b7aca36a053a3d012c2a994cf26de21055e861944b4c4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.231115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b5454650-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '19d312a854ecfc6f4259374753ecb2779106770ce18dd56c34d0f61a9552a3b4'}]}, 'timestamp': '2026-02-23 09:54:56.232525', '_unique_id': '7e54064a770b40048cc807b794acb0f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:54:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.233 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.235 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '46fc66c1-ba89-4d12-92c9-d7e7fcc6c884', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.236394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b545f938-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '72c2776b43ffed0221e4b45b101d26821bcddb353ca248677023bb7ea7607ccd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.236394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b5460fea-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': 'c7ffac90749452e6893d6bfed09c86790d4357383ce41c342b9b5b5b4203e007'}]}, 'timestamp': '2026-02-23 09:54:56.237569', '_unique_id': '3abfa13945324cbfb36b588e9126675d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 04:54:56 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:56.250 2 INFO neutron.agent.securitygroups_rpc [None req-e4e33ab4-a498-48c3-b62b-d466306b0b2c 8c67fb6133284335807155391776f7a4 35e3e6665f014caf91b19ef9e685a75a - - default default] Security group member updated ['18ad37ac-3bf6-435c-949b-384a2e1dc20f']#033[00m Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 13130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c4011c7-feb8-423f-9760-4d67a6aeff40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13130000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:54:56.241176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b54954b6-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.447862772, 'message_signature': '118f84f09689002f96a8ae7d593b2c013e8eb5c76e2b947c11208b9142596fa4'}]}, 'timestamp': '2026-02-23 09:54:56.259086', '_unique_id': 'b0d681469f7743f58711def3e0fb6c1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36adebc1-d2ac-4ed6-8c90-46a64224a7a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.262327', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b549e93a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '93f6057e414dc7617c452e34c871e3ac5c098bb8afcec36213a0da66c97813e3'}]}, 'timestamp': '2026-02-23 09:54:56.262840', '_unique_id': '9da49dd6bbc048cc9f4bc387dcc72ada'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.265 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aba727f1-df44-400c-9d11-6b956bf11125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:54:56.265600', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b54a70ee-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.447862772, 'message_signature': '85ec1429ed383a4ea2a2b08606da369b5288b8f489c861ee003227f9abae42ba'}]}, 'timestamp': '2026-02-23 09:54:56.266360', '_unique_id': 'bf3fe55251de4a9ba042ec9781f3b1f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.267 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0589513c-2fcb-4869-b120-086e3022de2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.269357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54b009a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': '67e873917b392a7a1a61eb0d8f53ca7506c03e26d4b03494f17b891b0df42894'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.269357', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54b180a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.337230841, 'message_signature': 'dbce286e6d8e909fc0b4fa386df6db3375f1d3f5dd6935c9f00d7621681ce45e'}]}, 'timestamp': '2026-02-23 09:54:56.270551', '_unique_id': 'e5de1998b89e4621bf58c1546dd655d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.273 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9b2896e-6d3e-40db-b90d-cd8626b5c124', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.273246', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54b98e8-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '2c2cbf8e0058465d9fb0d5b46325d35a3336658630fdee3c41ad979faa2046d7'}]}, 'timestamp': '2026-02-23 09:54:56.274048', '_unique_id': 'ec88f56c5b8c44f08a9029429688f7ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111]
Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.275 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.276 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4773f4f0-abe9-41a3-9fc4-e0de44ae2415', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.276600', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54c22ea-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'b3bd28d7839920e1556199d5d60846c2a9a8a96d58e023792dada96b9fb3b404'}]}, 'timestamp': '2026-02-23 09:54:56.277511', '_unique_id': 'b7f5d05ecdab44acaa20bf6870ffc11a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.278 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.279 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.280 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66d6317e-9b6c-4f39-8525-9a2394d2c5d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.279894', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54c948c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '931303c450476eac1cbc4b51b7f33005b8eed205737ad8a525b78d11ff6109e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.279894', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54c9f86-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': 'f4752b9e065c74376873824fbe7bceb27bc0e38e560372d2273933f47f559dc5'}]}, 'timestamp': '2026-02-23 09:54:56.280565', '_unique_id': '2562b358cc054694a4beffecb9332865'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.281 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.282 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eedfa6c8-c180-4a85-bc25-49269f86e16f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.282512', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54cf9c2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'f550b42406adf324968bf67556ee49f8932a604469252de7a3093fad74149aff'}]}, 'timestamp': '2026-02-23 09:54:56.282815', '_unique_id': '609dfe2043f94029977274d536413ff1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.283 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.284 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.284 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2604a197-9685-420d-ba90-5d14560955e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.284213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54d3e8c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '52c407b65b1f92a509df30304557a3b7ec2a1bff992f59c195e09b7509ed95ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.284213', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54d4e2c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '7c62fde312d422d98c166af4399fa2bbd8fbf11f2a37e747a00b9baaaf4621ca'}]}, 'timestamp': '2026-02-23 09:54:56.285042', '_unique_id': '806378f9789a4615980e99486fd006cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.285 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.286 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f4e8f8-75b7-4060-b997-a0e51a59606f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.286961', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54da76e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '4bcf5ab3dc81e72d1bc2077c1dd4d96080402374899c9712028bbb0bc45376c0'}]}, 'timestamp': '2026-02-23 09:54:56.287262', '_unique_id': '04df3ed4419b4f82984b5daf09a2e29d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.287 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.288 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.289 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d26078d-87ad-442c-ada0-7760a3f82e7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:54:56.288825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b54df228-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '9c4d4f0182c375aed6528bcf2e3ca89aa77ada34c7865b7f20e61419e85c308d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:54:56.288825', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b54dfe80-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.371695197, 'message_signature': '3ff48f39c7c70094bad5c000092fa557d7b3a0424b0e7fe893c426fa49b1765b'}]}, 'timestamp': '2026-02-23 09:54:56.289557', '_unique_id': 'aa6cbc5a164e4588b3d3903931c96114'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.290 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.291 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.291 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dad6553d-9c61-4512-a01c-551023e7ee8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.291353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54e5628-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'd6d07421900bd9a5e15c3b50a4d8a8f626919650189a8922cd09a148f83bb2c1'}]}, 'timestamp': '2026-02-23 09:54:56.291751', '_unique_id': 'bceff6e266824e3881b1b0af72ef335a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
877, in _connection_factory Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.292 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.293 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.293 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e80d240-436b-4a87-bf84-8364b205642b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.293591', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54ea9f2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': '3ce1d0ba98ef7ea51bdea3c48cc7dfd17bb7680b3c6409dc9693f70ec9c9dd06'}]}, 'timestamp': '2026-02-23 09:54:56.293916', '_unique_id': '4c29eaa94b8f46e591a966952cd1b748'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.294 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.295 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.295 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52495aba-bd53-4ede-b021-c8dab20721b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:54:56.295492', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'b54ef43e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 11936.35971834, 'message_signature': 'fc67d181bf5d9f1f239fd13eb72a8599e24c51255801c9e41f3f6afaee62c9ca'}]}, 'timestamp': '2026-02-23 09:54:56.295780', '_unique_id': '7bff1a9cc8ea422b9bb355f896375118'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:54:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:54:56.296 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:54:56 localhost nova_compute[282206]: 2026-02-23 09:54:56.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:56 localhost neutron_sriov_agent[258207]: 2026-02-23 09:54:56.733 2 INFO neutron.agent.securitygroups_rpc [None req-5f241d63-4aea-41aa-b22a-dca2e6055a49 8c67fb6133284335807155391776f7a4 35e3e6665f014caf91b19ef9e685a75a - - default default] Security group member updated ['18ad37ac-3bf6-435c-949b-384a2e1dc20f']#033[00m
Feb 23 04:54:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:54:57 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:57.393 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:54:54Z, description=, device_id=daed35e2-7c42-4b10-a372-e586a9bec4ff, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=126e04eb-5ef6-4c95-9995-1dc19782e90c, ip_allocation=immediate, mac_address=fa:16:3e:63:df:d6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:54:48Z, description=, dns_domain=, id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1249780383-network, port_security_enabled=True, project_id=a0bd74f028104d8eab41537070541f77, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3813, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=781, status=ACTIVE, subnets=['aadca0bd-2aca-4d46-8acc-f53c76863e44'], tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:50Z, vlan_transparent=None, network_id=c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, port_security_enabled=False, project_id=a0bd74f028104d8eab41537070541f77, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=830, status=DOWN, tags=[], tenant_id=a0bd74f028104d8eab41537070541f77, updated_at=2026-02-23T09:54:55Z on network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f#033[00m
Feb 23 04:54:57 localhost dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 1 addresses
Feb 23 04:54:57 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 04:54:57 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 04:54:57 localhost podman[310100]: 2026-02-23 09:54:57.626016559 +0000 UTC m=+0.064801278 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 04:54:57 localhost nova_compute[282206]: 2026-02-23 09:54:57.629 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:54:57 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:54:57.958 265541 INFO neutron.agent.dhcp.agent [None req-e608f138-ffbf-4dc8-8bd2-b1786562fd53 - - - - - -] DHCP configuration for ports {'126e04eb-5ef6-4c95-9995-1dc19782e90c'} is completed#033[00m
Feb 23 04:55:01 localhost dnsmasq[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/addn_hosts - 0 addresses
Feb 23 04:55:01 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/host
Feb 23 04:55:01 localhost podman[310139]: 2026-02-23 09:55:01.564821196 +0000 UTC m=+0.038147080 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 04:55:01 localhost dnsmasq-dhcp[309968]: read /var/lib/neutron/dhcp/c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f/opts
Feb 23 04:55:01 localhost ovn_controller[157695]: 2026-02-23T09:55:01Z|00117|binding|INFO|Releasing lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 from this chassis (sb_readonly=0)
Feb 23 04:55:01 localhost nova_compute[282206]: 2026-02-23 09:55:01.715 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:55:01 localhost kernel: device tap8f0c5ba8-2f left promiscuous mode
Feb 23 04:55:01 localhost ovn_controller[157695]: 2026-02-23T09:55:01Z|00118|binding|INFO|Setting lport 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 down in Southbound
Feb 23 04:55:01 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:01.726 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a0bd74f028104d8eab41537070541f77', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d55ec2a4-c84a-4c50-ac81-d4cad2b77be1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 04:55:01 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:01.728 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 8f0c5ba8-2f8d-4a97-9bc4-b7dd81bb7431 in datapath c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f unbound from our chassis#033[00m
Feb 23 04:55:01 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:01.732 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Feb 23 04:55:01 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:01.733 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5d03b4c4-fbe6-4bfa-bcdc-40f3d378507c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 04:55:01 localhost nova_compute[282206]: 2026-02-23 09:55:01.739 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:55:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:55:02 localhost nova_compute[282206]: 2026-02-23 09:55:02.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:55:03 localhost nova_compute[282206]: 2026-02-23 09:55:03.013 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:55:03 localhost ovn_controller[157695]: 2026-02-23T09:55:03Z|00119|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 04:55:03 localhost nova_compute[282206]: 2026-02-23 09:55:03.175 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 04:55:03 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:03.907 2 INFO neutron.agent.securitygroups_rpc [None req-d42fa55b-3158-4a2c-8987-32ad0eaad7e9 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group rule updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']#033[00m
Feb 23 04:55:04 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:04.028 2 INFO neutron.agent.securitygroups_rpc [None req-e1f7edff-cd43-4ce1-b8ac-263a3462cef1 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group rule updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']#033[00m
Feb 23 04:55:04 localhost systemd[1]: tmp-crun.7Nw3bz.mount: Deactivated successfully.
Feb 23 04:55:04 localhost dnsmasq[309968]: exiting on receipt of SIGTERM
Feb 23 04:55:04 localhost podman[310180]: 2026-02-23 09:55:04.451471308 +0000 UTC m=+0.069259001 container kill caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 23 04:55:04 localhost systemd[1]: libpod-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81.scope: Deactivated successfully.
Feb 23 04:55:04 localhost podman[310192]: 2026-02-23 09:55:04.519550421 +0000 UTC m=+0.058085126 container died caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:04 localhost podman[310192]: 2026-02-23 09:55:04.564642264 +0000 UTC m=+0.103176929 container cleanup caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:55:04 localhost systemd[1]: libpod-conmon-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81.scope: Deactivated successfully. 
Feb 23 04:55:04 localhost podman[310199]: 2026-02-23 09:55:04.592927948 +0000 UTC m=+0.117744529 container remove caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c5dfb0c2-3b4f-4a80-ad77-54809a86fa2f, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 04:55:04 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:04.853 265541 INFO neutron.agent.dhcp.agent [None req-ff3cd298-8062-4235-a6fd-9e5d9742d7db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:04 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:04.916 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:55:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:55:05 localhost podman[310218]: 2026-02-23 09:55:05.417269506 +0000 UTC m=+0.087335319 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:55:05 localhost podman[310218]: 2026-02-23 09:55:05.429604728 +0000 UTC m=+0.099670501 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', 
'--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:55:05 localhost systemd[1]: tmp-crun.WhzQWp.mount: Deactivated successfully. Feb 23 04:55:05 localhost systemd[1]: var-lib-containers-storage-overlay-166f830536de571ce740513568578247182f4fa7b08cb2a3bc0f31bbb6bd274f-merged.mount: Deactivated successfully. Feb 23 04:55:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-caa7016a7c1ecbf861dfa0d41a48c5810edcc616cd59faa2f2dd0e6e56e73a81-userdata-shm.mount: Deactivated successfully. Feb 23 04:55:05 localhost systemd[1]: run-netns-qdhcp\x2dc5dfb0c2\x2d3b4f\x2d4a80\x2dad77\x2d54809a86fa2f.mount: Deactivated successfully. Feb 23 04:55:05 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:55:05 localhost podman[310217]: 2026-02-23 09:55:05.519961819 +0000 UTC m=+0.194062706 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:05 localhost podman[310217]: 2026-02-23 09:55:05.614299334 +0000 UTC m=+0.288400241 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:05 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:55:05 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:05.768 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:06 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:06.237 2 INFO neutron.agent.securitygroups_rpc [None req-ba0f42b0-8ba1-470a-9030-cc088e382d98 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']#033[00m Feb 23 04:55:06 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:06.384 2 INFO neutron.agent.securitygroups_rpc [None req-ba0f42b0-8ba1-470a-9030-cc088e382d98 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']#033[00m Feb 23 04:55:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:07 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:07.516 2 INFO neutron.agent.securitygroups_rpc [None req-d0d1b9f7-4125-4c2c-a5f0-a4480a3b4692 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']#033[00m Feb 23 04:55:07 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:07.564 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:07 localhost nova_compute[282206]: 2026-02-23 09:55:07.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:07 localhost nova_compute[282206]: 2026-02-23 09:55:07.666 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:07 
localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:07.877 2 INFO neutron.agent.securitygroups_rpc [None req-295aa794-6290-4e56-bed7-db18bb8fb456 dcce68a4e6d440099c8b52030a278ab7 1349075215be49eda0b375e59aa77e22 - - default default] Security group member updated ['1a30abeb-10f2-4401-bae3-62a7c905b8e3']#033[00m Feb 23 04:55:07 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:07.913 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:08 localhost nova_compute[282206]: 2026-02-23 09:55:08.761 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:55:08 localhost podman[310267]: 2026-02-23 09:55:08.91146452 +0000 UTC m=+0.083355726 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': 
'/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:55:08 localhost podman[310267]: 2026-02-23 09:55:08.927172535 +0000 UTC m=+0.099063741 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:55:08 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:55:09 localhost podman[242954]: time="2026-02-23T09:55:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:55:09 localhost podman[242954]: @ - - [23/Feb/2026:09:55:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 04:55:09 localhost podman[242954]: @ - - [23/Feb/2026:09:55:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18811 "" "Go-http-client/1.1" Feb 23 04:55:09 localhost nova_compute[282206]: 2026-02-23 09:55:09.607 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:10 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:10.290 2 INFO neutron.agent.securitygroups_rpc [req-54148787-22c3-403e-9c83-5d532e459a95 req-dcd0fbb9-2ed1-469d-8be3-13ca7cbeb9c8 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group member updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']#033[00m Feb 23 04:55:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:12 localhost nova_compute[282206]: 2026-02-23 09:55:12.085 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:12 localhost nova_compute[282206]: 2026-02-23 09:55:12.665 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:55:12 localhost podman[310288]: 2026-02-23 09:55:12.903651482 +0000 UTC m=+0.073210122 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216) Feb 23 04:55:12 localhost 
podman[310288]: 2026-02-23 09:55:12.938176859 +0000 UTC m=+0.107735509 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:55:12 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:55:13 localhost openstack_network_exporter[245358]: ERROR 09:55:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:55:13 localhost openstack_network_exporter[245358]: Feb 23 04:55:13 localhost openstack_network_exporter[245358]: ERROR 09:55:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:55:13 localhost openstack_network_exporter[245358]: Feb 23 04:55:14 localhost ovn_controller[157695]: 2026-02-23T09:55:14Z|00120|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:14 localhost nova_compute[282206]: 2026-02-23 09:55:14.540 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:15 localhost ovn_controller[157695]: 2026-02-23T09:55:15Z|00121|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:15 localhost nova_compute[282206]: 2026-02-23 09:55:15.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:16 localhost nova_compute[282206]: 2026-02-23 09:55:16.180 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:55:17 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' 
entity='mgr.np0005626465.hlpkwo' Feb 23 04:55:17 localhost nova_compute[282206]: 2026-02-23 09:55:17.668 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:17 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:55:17 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:55:19 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:19.939 265541 INFO neutron.agent.linux.ip_lib [None req-17e43156-c4f0-496a-902f-89d922724e50 - - - - - -] Device tapfea38170-06 cannot be used as it has no MAC address#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:20 localhost kernel: device tapfea38170-06 entered promiscuous mode Feb 23 04:55:20 localhost NetworkManager[5974]: [1771840520.0168] manager: (tapfea38170-06): new Generic device (/org/freedesktop/NetworkManager/Devices/23) Feb 23 04:55:20 localhost ovn_controller[157695]: 2026-02-23T09:55:20Z|00122|binding|INFO|Claiming lport fea38170-0626-427b-8a36-b82b8e008ab6 for this chassis. Feb 23 04:55:20 localhost ovn_controller[157695]: 2026-02-23T09:55:20Z|00123|binding|INFO|fea38170-0626-427b-8a36-b82b8e008ab6: Claiming unknown Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:20 localhost systemd-udevd[310401]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.025 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:20 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:20.043 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5632ff1108264def864ca9b5473cb716', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cca3f636-88c2-4a23-a28f-aa045d27b076, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fea38170-0626-427b-8a36-b82b8e008ab6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:20 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:20.045 163572 INFO neutron.agent.ovn.metadata.agent [-] Port fea38170-0626-427b-8a36-b82b8e008ab6 in datapath 5b164f5a-6aae-4898-a6ea-a1c77a8cf652 bound to our chassis#033[00m Feb 23 04:55:20 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:20.047 163572 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:20 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:20.048 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[8bd7a47a-49cb-431d-bfd3-1c208c57d7c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost ovn_controller[157695]: 2026-02-23T09:55:20Z|00124|binding|INFO|Setting lport fea38170-0626-427b-8a36-b82b8e008ab6 ovn-installed in OVS Feb 23 04:55:20 localhost ovn_controller[157695]: 2026-02-23T09:55:20Z|00125|binding|INFO|Setting lport fea38170-0626-427b-8a36-b82b8e008ab6 up in Southbound Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.058 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost journal[231253]: ethtool ioctl error on tapfea38170-06: No such device Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.092 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.115 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:55:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.593 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.593 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - 
- - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.594 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:55:20 localhost nova_compute[282206]: 2026-02-23 09:55:20.594 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:55:20 localhost podman[310472]: Feb 23 04:55:20 localhost podman[310472]: 2026-02-23 09:55:20.945994665 +0000 UTC m=+0.094701417 container create 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216) Feb 23 04:55:20 localhost systemd[1]: Started libpod-conmon-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905.scope. Feb 23 04:55:20 localhost podman[310472]: 2026-02-23 09:55:20.897638401 +0000 UTC m=+0.046345172 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:55:21 localhost systemd[1]: Started libcrun container. 
Feb 23 04:55:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/930a766cf5e4b8a84c31286e57382a4788939504b12f986029e04d96b0f3a126/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:55:21 localhost podman[310472]: 2026-02-23 09:55:21.015027938 +0000 UTC m=+0.163734689 container init 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:21 localhost podman[310472]: 2026-02-23 09:55:21.023816569 +0000 UTC m=+0.172523310 container start 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216) Feb 23 04:55:21 localhost dnsmasq[310490]: started, version 2.85 cachesize 150 Feb 23 04:55:21 localhost dnsmasq[310490]: DNS service limited to local subnets Feb 23 04:55:21 localhost dnsmasq[310490]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:55:21 localhost dnsmasq[310490]: warning: no upstream servers 
configured Feb 23 04:55:21 localhost dnsmasq-dhcp[310490]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:55:21 localhost dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 0 addresses Feb 23 04:55:21 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host Feb 23 04:55:21 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts Feb 23 04:55:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:55:21 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:21.204 265541 INFO neutron.agent.dhcp.agent [None req-a77d95f7-8e38-4044-b7ba-f69e17d3490f - - - - - -] DHCP configuration for ports {'a1c7667d-69ec-4b95-8d4d-acc5bfc0d1b5'} is completed#033[00m Feb 23 04:55:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:55:21 localhost nova_compute[282206]: 2026-02-23 09:55:21.842 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:55:21 localhost nova_compute[282206]: 2026-02-23 09:55:21.859 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:55:21 localhost nova_compute[282206]: 2026-02-23 09:55:21.860 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:55:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:21 localhost podman[310491]: 2026-02-23 09:55:21.895722828 +0000 UTC m=+0.070989125 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:55:21 localhost podman[310491]: 2026-02-23 09:55:21.908361258 +0000 UTC m=+0.083627545 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:55:21 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:55:21 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:21.926 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:22 localhost nova_compute[282206]: 2026-02-23 09:55:22.718 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:22 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:22.928 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:23 localhost nova_compute[282206]: 2026-02-23 09:55:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:23 localhost nova_compute[282206]: 2026-02-23 09:55:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:23 localhost nova_compute[282206]: 2026-02-23 09:55:23.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:55:23 localhost nova_compute[282206]: 2026-02-23 09:55:23.706 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:24 localhost nova_compute[282206]: 2026-02-23 09:55:24.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:24 localhost nova_compute[282206]: 2026-02-23 09:55:24.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:24 localhost nova_compute[282206]: 2026-02-23 09:55:24.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:24 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:24.813 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:55:24Z, description=, device_id=f4707105-8993-4923-9b08-bd7f3dfa76d5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=96c8333d-b497-441e-9fb8-278da37499dc, ip_allocation=immediate, mac_address=fa:16:3e:e3:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2026-02-23T09:55:17Z, description=, dns_domain=, id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-170814225-network, port_security_enabled=True, project_id=5632ff1108264def864ca9b5473cb716, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=955, status=ACTIVE, subnets=['a71d2280-a111-45a5-9765-078a6bc8268f'], tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:18Z, vlan_transparent=None, network_id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, port_security_enabled=False, project_id=5632ff1108264def864ca9b5473cb716, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=998, status=DOWN, tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:24Z on network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652#033[00m Feb 23 04:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:55:24 localhost podman[310514]: 2026-02-23 09:55:24.900949735 +0000 UTC m=+0.071526191 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, release=1770267347, version=9.7, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64) Feb 23 04:55:24 localhost podman[310514]: 2026-02-23 09:55:24.916242758 +0000 UTC m=+0.086819204 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 23 04:55:24 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:55:25 localhost nova_compute[282206]: 2026-02-23 09:55:25.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:25 localhost podman[310548]: 2026-02-23 09:55:25.053838149 +0000 UTC m=+0.068927261 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:55:25 localhost systemd[1]: tmp-crun.lq3rCc.mount: Deactivated successfully. 
Feb 23 04:55:25 localhost dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 1 addresses Feb 23 04:55:25 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host Feb 23 04:55:25 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts Feb 23 04:55:25 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:25.283 265541 INFO neutron.agent.dhcp.agent [None req-17b79acc-a3db-4c16-b59e-6f9ed9a4a168 - - - - - -] DHCP configuration for ports {'96c8333d-b497-441e-9fb8-278da37499dc'} is completed#033[00m Feb 23 04:55:25 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:25.884 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:55:24Z, description=, device_id=f4707105-8993-4923-9b08-bd7f3dfa76d5, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=96c8333d-b497-441e-9fb8-278da37499dc, ip_allocation=immediate, mac_address=fa:16:3e:e3:7e:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:55:17Z, description=, dns_domain=, id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-170814225-network, port_security_enabled=True, project_id=5632ff1108264def864ca9b5473cb716, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63310, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=955, status=ACTIVE, subnets=['a71d2280-a111-45a5-9765-078a6bc8268f'], tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:18Z, 
vlan_transparent=None, network_id=5b164f5a-6aae-4898-a6ea-a1c77a8cf652, port_security_enabled=False, project_id=5632ff1108264def864ca9b5473cb716, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=998, status=DOWN, tags=[], tenant_id=5632ff1108264def864ca9b5473cb716, updated_at=2026-02-23T09:55:24Z on network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652#033[00m Feb 23 04:55:26 localhost nova_compute[282206]: 2026-02-23 09:55:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:26 localhost podman[310585]: 2026-02-23 09:55:26.118085189 +0000 UTC m=+0.068650342 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:55:26 localhost dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 1 addresses Feb 23 04:55:26 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host Feb 23 04:55:26 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts Feb 23 04:55:26 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:26.813 265541 INFO neutron.agent.dhcp.agent [None req-36e0b5c7-2f90-42f4-a8e4-18f2d34148ce - - - - - -] DHCP configuration for ports 
{'96c8333d-b497-441e-9fb8-278da37499dc'} is completed#033[00m Feb 23 04:55:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.067 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.067 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.068 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.068 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) 
update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.068 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:55:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:55:27 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2239828795' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.512 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.578 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.579 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.721 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.817 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.818 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11380MB free_disk=41.77389907836914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.819 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.820 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.919 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.920 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.921 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:55:27 localhost nova_compute[282206]: 2026-02-23 09:55:27.964 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:55:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:55:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1845912676' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:55:28 localhost nova_compute[282206]: 2026-02-23 09:55:28.420 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:55:28 localhost nova_compute[282206]: 2026-02-23 09:55:28.427 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:55:28 localhost nova_compute[282206]: 2026-02-23 09:55:28.444 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:55:28 localhost nova_compute[282206]: 2026-02-23 09:55:28.447 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:55:28 localhost nova_compute[282206]: 2026-02-23 09:55:28.447 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:55:28 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:28.601 2 INFO neutron.agent.securitygroups_rpc [None req-9a0eab4a-b1f8-4315-8f9f-526f0b4e43e4 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']#033[00m Feb 23 04:55:28 localhost nova_compute[282206]: 2026-02-23 09:55:28.900 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:29 localhost nova_compute[282206]: 2026-02-23 09:55:29.448 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:55:30 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:30.654 265541 INFO neutron.agent.linux.ip_lib [None req-c9653e48-282d-47cb-9d54-ee84f65f6f1a - - - - - -] Device tap5e31c1f9-f2 cannot be used as it has no MAC address#033[00m Feb 23 04:55:30 localhost nova_compute[282206]: 2026-02-23 09:55:30.675 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:30 localhost kernel: device tap5e31c1f9-f2 entered promiscuous mode Feb 23 04:55:30 localhost nova_compute[282206]: 2026-02-23 09:55:30.683 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:30 localhost NetworkManager[5974]: [1771840530.6837] manager: 
(tap5e31c1f9-f2): new Generic device (/org/freedesktop/NetworkManager/Devices/24) Feb 23 04:55:30 localhost ovn_controller[157695]: 2026-02-23T09:55:30Z|00126|binding|INFO|Claiming lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a for this chassis. Feb 23 04:55:30 localhost ovn_controller[157695]: 2026-02-23T09:55:30Z|00127|binding|INFO|5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a: Claiming unknown Feb 23 04:55:30 localhost systemd-udevd[310659]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:55:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:30.694 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0b7f1d9-1471-4000-b583-343082500ed7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:30 localhost ovn_metadata_agent[163567]: 
2026-02-23 09:55:30.696 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a in datapath b207e42e-4d3c-43ce-b855-2d1a36797be6 bound to our chassis#033[00m Feb 23 04:55:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:30.699 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b207e42e-4d3c-43ce-b855-2d1a36797be6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:30.701 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ed3b815d-04a2-4053-9b16-a81e541838b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost ovn_controller[157695]: 2026-02-23T09:55:30Z|00128|binding|INFO|Setting lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a ovn-installed in OVS Feb 23 04:55:30 localhost ovn_controller[157695]: 2026-02-23T09:55:30Z|00129|binding|INFO|Setting lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a up in Southbound Feb 23 04:55:30 localhost nova_compute[282206]: 2026-02-23 09:55:30.729 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost journal[231253]: 
ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost journal[231253]: ethtool ioctl error on tap5e31c1f9-f2: No such device Feb 23 04:55:30 localhost nova_compute[282206]: 2026-02-23 09:55:30.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:30.775 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:30 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:30.776 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:55:30 localhost nova_compute[282206]: 2026-02-23 09:55:30.779 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:30 localhost nova_compute[282206]: 2026-02-23 09:55:30.798 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:31 localhost podman[310730]: Feb 23 04:55:31 localhost podman[310730]: 2026-02-23 09:55:31.574125426 +0000 UTC m=+0.095762110 container create 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216) Feb 23 04:55:31 localhost systemd[1]: Started libpod-conmon-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1.scope. Feb 23 04:55:31 localhost podman[310730]: 2026-02-23 09:55:31.531789568 +0000 UTC m=+0.053426242 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:55:31 localhost systemd[1]: Started libcrun container. Feb 23 04:55:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/85d2c0758da427144c001a7e97c7359bf4b87ae2a86e80ab66c7f916f65db929/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:55:31 localhost podman[310730]: 2026-02-23 09:55:31.655509711 +0000 UTC m=+0.177146345 container init 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 04:55:31 localhost podman[310730]: 2026-02-23 09:55:31.666424318 +0000 UTC m=+0.188060952 container start 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:55:31 localhost dnsmasq[310749]: started, version 2.85 cachesize 150 Feb 23 04:55:31 localhost dnsmasq[310749]: DNS service limited to local subnets Feb 23 04:55:31 localhost dnsmasq[310749]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:55:31 localhost dnsmasq[310749]: warning: no upstream servers configured Feb 23 04:55:31 localhost dnsmasq-dhcp[310749]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Feb 23 04:55:31 localhost dnsmasq[310749]: read /var/lib/neutron/dhcp/b207e42e-4d3c-43ce-b855-2d1a36797be6/addn_hosts - 0 addresses Feb 23 04:55:31 localhost dnsmasq-dhcp[310749]: read /var/lib/neutron/dhcp/b207e42e-4d3c-43ce-b855-2d1a36797be6/host Feb 23 04:55:31 localhost dnsmasq-dhcp[310749]: read /var/lib/neutron/dhcp/b207e42e-4d3c-43ce-b855-2d1a36797be6/opts Feb 23 04:55:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:32.089 265541 INFO neutron.agent.dhcp.agent [None req-279e7834-41a1-4ec5-859a-2b8e7a4535b7 - - - - - -] DHCP configuration for ports {'24d8afdd-ed5e-4b36-b6ce-17f8eb8c09d3'} is completed#033[00m Feb 23 04:55:32 localhost nova_compute[282206]: 2026-02-23 09:55:32.759 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:32.778 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:55:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e107 do_prune osdmap full prune enabled Feb 23 04:55:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e108 e108: 6 total, 6 up, 6 in Feb 23 04:55:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in Feb 23 04:55:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:33.238 265541 INFO neutron.agent.linux.ip_lib [None req-5b78d67a-d67a-4488-96a5-505670629ace - - - - - -] Device tap6025ad38-91 cannot be used as it has no MAC address#033[00m Feb 23 04:55:33 localhost nova_compute[282206]: 2026-02-23 09:55:33.260 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:33 localhost kernel: device tap6025ad38-91 entered promiscuous mode Feb 23 04:55:33 localhost NetworkManager[5974]: [1771840533.2670] manager: (tap6025ad38-91): new Generic device (/org/freedesktop/NetworkManager/Devices/25) Feb 23 04:55:33 localhost nova_compute[282206]: 2026-02-23 09:55:33.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:33 localhost systemd-udevd[310662]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:55:33 localhost ovn_controller[157695]: 2026-02-23T09:55:33Z|00130|binding|INFO|Claiming lport 6025ad38-916c-468c-898d-3a9de80cd0c9 for this chassis. Feb 23 04:55:33 localhost ovn_controller[157695]: 2026-02-23T09:55:33Z|00131|binding|INFO|6025ad38-916c-468c-898d-3a9de80cd0c9: Claiming unknown Feb 23 04:55:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:33.279 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb2ad155-4f5a-4d4b-8819-d0aef1f50516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6025ad38-916c-468c-898d-3a9de80cd0c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:33.281 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6025ad38-916c-468c-898d-3a9de80cd0c9 in datapath 8523f038-ac71-4b3d-b11f-1dcce416acd1 bound to our chassis#033[00m Feb 23 
04:55:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:33.284 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8523f038-ac71-4b3d-b11f-1dcce416acd1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:33.285 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[53ac0be8-7757-42c8-a1e6-045c3f4e15fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost ovn_controller[157695]: 2026-02-23T09:55:33Z|00132|binding|INFO|Setting lport 6025ad38-916c-468c-898d-3a9de80cd0c9 ovn-installed in OVS Feb 23 04:55:33 localhost ovn_controller[157695]: 2026-02-23T09:55:33Z|00133|binding|INFO|Setting lport 6025ad38-916c-468c-898d-3a9de80cd0c9 up in Southbound Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost nova_compute[282206]: 2026-02-23 09:55:33.304 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost journal[231253]: ethtool ioctl error on tap6025ad38-91: No such device Feb 23 04:55:33 localhost nova_compute[282206]: 
2026-02-23 09:55:33.345 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:33 localhost nova_compute[282206]: 2026-02-23 09:55:33.377 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:34 localhost ovn_controller[157695]: 2026-02-23T09:55:34Z|00134|binding|INFO|Removing iface tap6025ad38-91 ovn-installed in OVS Feb 23 04:55:34 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:34.039 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 8eebb565-f801-41e8-8e69-e55ef03184ba with type ""#033[00m Feb 23 04:55:34 localhost ovn_controller[157695]: 2026-02-23T09:55:34Z|00135|binding|INFO|Removing lport 6025ad38-916c-468c-898d-3a9de80cd0c9 ovn-installed in OVS Feb 23 04:55:34 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:34.075 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8523f038-ac71-4b3d-b11f-1dcce416acd1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], 
tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb2ad155-4f5a-4d4b-8819-d0aef1f50516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6025ad38-916c-468c-898d-3a9de80cd0c9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:34 localhost nova_compute[282206]: 2026-02-23 09:55:34.075 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:34 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:34.077 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6025ad38-916c-468c-898d-3a9de80cd0c9 in datapath 8523f038-ac71-4b3d-b11f-1dcce416acd1 unbound from our chassis#033[00m Feb 23 04:55:34 localhost nova_compute[282206]: 2026-02-23 09:55:34.079 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:34 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:34.080 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8523f038-ac71-4b3d-b11f-1dcce416acd1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:34 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:34.081 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[4b02b6e3-0607-4ae4-afe6-a599187fa219]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:34 localhost ovn_controller[157695]: 2026-02-23T09:55:34Z|00136|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:34 localhost podman[310829]: Feb 23 04:55:34 localhost podman[310829]: 2026-02-23 09:55:34.194685189 +0000 UTC m=+0.089000251 container create 
a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2) Feb 23 04:55:34 localhost nova_compute[282206]: 2026-02-23 09:55:34.222 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:34 localhost systemd[1]: Started libpod-conmon-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb.scope. Feb 23 04:55:34 localhost podman[310829]: 2026-02-23 09:55:34.151816205 +0000 UTC m=+0.046131297 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:55:34 localhost systemd[1]: tmp-crun.3aeZMY.mount: Deactivated successfully. Feb 23 04:55:34 localhost systemd[1]: Started libcrun container. 
Feb 23 04:55:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b6228693ebd24685afa42cc46e8b1f978c093b0e70be55a48dc426f2ec7a90c0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:55:34 localhost podman[310829]: 2026-02-23 09:55:34.280038616 +0000 UTC m=+0.174353678 container init a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:34 localhost podman[310829]: 2026-02-23 09:55:34.289083556 +0000 UTC m=+0.183398618 container start a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:34 localhost dnsmasq[310845]: started, version 2.85 cachesize 150 Feb 23 04:55:34 localhost dnsmasq[310845]: DNS service limited to local subnets Feb 23 04:55:34 localhost dnsmasq[310845]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:55:34 localhost dnsmasq[310845]: warning: no upstream servers 
configured Feb 23 04:55:34 localhost dnsmasq-dhcp[310845]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:55:34 localhost dnsmasq[310845]: read /var/lib/neutron/dhcp/8523f038-ac71-4b3d-b11f-1dcce416acd1/addn_hosts - 0 addresses Feb 23 04:55:34 localhost dnsmasq-dhcp[310845]: read /var/lib/neutron/dhcp/8523f038-ac71-4b3d-b11f-1dcce416acd1/host Feb 23 04:55:34 localhost dnsmasq-dhcp[310845]: read /var/lib/neutron/dhcp/8523f038-ac71-4b3d-b11f-1dcce416acd1/opts Feb 23 04:55:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:34.420 265541 INFO neutron.agent.dhcp.agent [None req-7458c64c-f1da-42ea-8af0-2ba340d0e403 - - - - - -] DHCP configuration for ports {'65f10a64-150b-4c93-8c26-165b199a7803'} is completed#033[00m Feb 23 04:55:34 localhost dnsmasq[310845]: exiting on receipt of SIGTERM Feb 23 04:55:34 localhost podman[310862]: 2026-02-23 09:55:34.540665169 +0000 UTC m=+0.065008130 container kill a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:55:34 localhost systemd[1]: libpod-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb.scope: Deactivated successfully. 
Feb 23 04:55:34 localhost podman[310876]: 2026-02-23 09:55:34.604282404 +0000 UTC m=+0.050326926 container died a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:34 localhost podman[310876]: 2026-02-23 09:55:34.690136217 +0000 UTC m=+0.136180689 container cleanup a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:55:34 localhost systemd[1]: libpod-conmon-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb.scope: Deactivated successfully. 
Feb 23 04:55:34 localhost podman[310883]: 2026-02-23 09:55:34.716909283 +0000 UTC m=+0.143806073 container remove a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8523f038-ac71-4b3d-b11f-1dcce416acd1, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:34 localhost kernel: device tap6025ad38-91 left promiscuous mode Feb 23 04:55:34 localhost nova_compute[282206]: 2026-02-23 09:55:34.733 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:34 localhost nova_compute[282206]: 2026-02-23 09:55:34.750 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:34.775 265541 INFO neutron.agent.dhcp.agent [None req-0faf63b8-4fac-4327-9992-7e942b5fe495 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:34.775 265541 INFO neutron.agent.dhcp.agent [None req-0faf63b8-4fac-4327-9992-7e942b5fe495 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e108 do_prune osdmap full prune enabled Feb 23 04:55:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e109 e109: 6 total, 6 up, 6 in Feb 23 04:55:35 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in Feb 23 04:55:35 
localhost systemd[1]: var-lib-containers-storage-overlay-b6228693ebd24685afa42cc46e8b1f978c093b0e70be55a48dc426f2ec7a90c0-merged.mount: Deactivated successfully. Feb 23 04:55:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a511d8aab5b7a0c853198896612e723afc41d439d12988cb6f377af15bc217fb-userdata-shm.mount: Deactivated successfully. Feb 23 04:55:35 localhost systemd[1]: run-netns-qdhcp\x2d8523f038\x2dac71\x2d4b3d\x2db11f\x2d1dcce416acd1.mount: Deactivated successfully. Feb 23 04:55:35 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:35.592 2 INFO neutron.agent.securitygroups_rpc [None req-0cb29720-58ee-4ab0-99f8-69e7c954667c 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']#033[00m Feb 23 04:55:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:55:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:55:35 localhost podman[310905]: 2026-02-23 09:55:35.898674404 +0000 UTC m=+0.067515446 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:55:35 localhost podman[310905]: 2026-02-23 09:55:35.960425982 +0000 UTC m=+0.129266974 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS 
Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:35 localhost systemd[1]: tmp-crun.yr3v2d.mount: Deactivated successfully. 
Feb 23 04:55:35 localhost podman[310906]: 2026-02-23 09:55:35.979341677 +0000 UTC m=+0.145761124 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:55:35 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:55:36 localhost podman[310906]: 2026-02-23 09:55:36.011753229 +0000 UTC m=+0.178172656 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:55:36 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:55:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e109 do_prune osdmap full prune enabled Feb 23 04:55:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e110 e110: 6 total, 6 up, 6 in Feb 23 04:55:36 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in Feb 23 04:55:36 localhost sshd[310952]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:55:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:37 localhost nova_compute[282206]: 2026-02-23 09:55:37.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:37 localhost nova_compute[282206]: 2026-02-23 09:55:37.766 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e110 do_prune osdmap full prune enabled Feb 23 04:55:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e111 e111: 6 total, 6 up, 6 in Feb 23 04:55:37 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in Feb 23 04:55:38 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:38.726 2 INFO neutron.agent.securitygroups_rpc [req-2c9e84fe-f5f5-4169-b610-b000c50ec955 req-5f426135-7d9d-4897-a8f1-4e578256ef9c b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['81d638c1-b5b2-4310-a6ad-c12f8ffa8182']#033[00m Feb 23 04:55:39 localhost podman[242954]: time="2026-02-23T09:55:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:55:39 localhost podman[242954]: @ - - [23/Feb/2026:09:55:39 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160729 "" "Go-http-client/1.1" Feb 23 04:55:39 localhost podman[242954]: @ - - [23/Feb/2026:09:55:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19760 "" "Go-http-client/1.1" Feb 23 04:55:39 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:39.719 2 INFO neutron.agent.securitygroups_rpc [req-2f94c104-6551-4f26-9e12-afea6d919b19 req-a08f01d8-ef48-4fc3-bf4b-c9f57e9a499f b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['81fcdb25-34e2-4e01-b6c6-c95398c61f96']#033[00m Feb 23 04:55:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:55:39 localhost podman[310954]: 2026-02-23 09:55:39.915043292 +0000 UTC m=+0.082222431 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:39 localhost podman[310954]: 2026-02-23 09:55:39.925666651 +0000 UTC m=+0.092845820 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:55:39 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 04:55:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e111 do_prune osdmap full prune enabled Feb 23 04:55:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e112 e112: 6 total, 6 up, 6 in Feb 23 04:55:40 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in Feb 23 04:55:40 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:40.228 2 INFO neutron.agent.securitygroups_rpc [req-740eea5f-ed9c-433b-ad37-bd212433a1f7 req-c27bf517-3a13-4ed5-83bb-bf5a421e5259 c2b38675f57640819bf191ad8152e7cb 7f67087411544c55a9225236eb297b90 - - default default] Security group member updated ['d03de417-eb2e-47e8-ad59-eae56add5dd4']#033[00m Feb 23 04:55:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e112 do_prune osdmap full prune enabled Feb 23 04:55:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e113 e113: 6 total, 6 up, 6 in Feb 23 04:55:41 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in Feb 23 04:55:41 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:41.455 2 INFO neutron.agent.securitygroups_rpc [req-e8f689e1-cc88-4c2b-896f-8a7df5cfb707 req-464689d7-a67a-4f0a-8a5e-818033ca8861 b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['dafd3ce0-31be-4a51-acc9-61744d386010']#033[00m Feb 23 04:55:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e113 do_prune osdmap full prune enabled Feb 23 04:55:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e114 e114: 6 total, 6 up, 6 in Feb 23 04:55:41 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in Feb 23 04:55:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:42.486 2 INFO 
neutron.agent.securitygroups_rpc [req-1b7ecdc5-863d-4ab9-b539-c81bbfea1261 req-49751f6e-ab71-482b-8a57-c1aca7f7d635 b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['315ff60a-a295-4b8a-bcc8-fd8b624c828e']#033[00m Feb 23 04:55:42 localhost nova_compute[282206]: 2026-02-23 09:55:42.768 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:55:42 localhost nova_compute[282206]: 2026-02-23 09:55:42.770 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:55:42 localhost nova_compute[282206]: 2026-02-23 09:55:42.770 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:55:42 localhost nova_compute[282206]: 2026-02-23 09:55:42.770 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:55:42 localhost nova_compute[282206]: 2026-02-23 09:55:42.826 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:42 localhost nova_compute[282206]: 2026-02-23 09:55:42.827 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:55:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e114 do_prune osdmap full prune enabled Feb 23 04:55:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e115 e115: 6 total, 6 up, 6 in Feb 23 04:55:42 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in Feb 23 04:55:43 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:55:43 localhost podman[310976]: 2026-02-23 09:55:43.226745347 +0000 UTC m=+0.088502434 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:43 localhost podman[310976]: 2026-02-23 09:55:43.232299909 +0000 UTC m=+0.094056926 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:43 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:55:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:43.247 265541 INFO neutron.agent.linux.ip_lib [None req-7079d0a7-0d2a-463c-8608-29e21df312c5 - - - - - -] Device tap2aa5a3c8-b2 cannot be used as it has no MAC address#033[00m Feb 23 04:55:43 localhost nova_compute[282206]: 2026-02-23 09:55:43.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:43 localhost kernel: device tap2aa5a3c8-b2 entered promiscuous mode Feb 23 04:55:43 localhost nova_compute[282206]: 2026-02-23 09:55:43.286 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:43 localhost NetworkManager[5974]: [1771840543.2873] manager: (tap2aa5a3c8-b2): new Generic device (/org/freedesktop/NetworkManager/Devices/26) Feb 23 04:55:43 localhost ovn_controller[157695]: 2026-02-23T09:55:43Z|00137|binding|INFO|Claiming lport 2aa5a3c8-b285-43de-863b-75a8af32f886 for this chassis. Feb 23 04:55:43 localhost ovn_controller[157695]: 2026-02-23T09:55:43Z|00138|binding|INFO|2aa5a3c8-b285-43de-863b-75a8af32f886: Claiming unknown Feb 23 04:55:43 localhost systemd-udevd[311002]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:55:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:43.306 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a5faee4-697a-4afe-96ef-26362544bf3c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2aa5a3c8-b285-43de-863b-75a8af32f886) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:43.308 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa5a3c8-b285-43de-863b-75a8af32f886 in datapath 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 bound to our chassis#033[00m Feb 23 04:55:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:43.310 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:43.312 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[6598eeb4-9108-4f74-8efc-665c1ff79dab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost ovn_controller[157695]: 2026-02-23T09:55:43Z|00139|binding|INFO|Setting lport 2aa5a3c8-b285-43de-863b-75a8af32f886 ovn-installed in OVS Feb 23 04:55:43 localhost ovn_controller[157695]: 2026-02-23T09:55:43Z|00140|binding|INFO|Setting lport 2aa5a3c8-b285-43de-863b-75a8af32f886 up in Southbound Feb 23 04:55:43 localhost nova_compute[282206]: 2026-02-23 09:55:43.323 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost journal[231253]: ethtool ioctl error on tap2aa5a3c8-b2: No such device Feb 23 04:55:43 localhost nova_compute[282206]: 2026-02-23 09:55:43.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:43 localhost openstack_network_exporter[245358]: ERROR 09:55:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an 
existing datapath Feb 23 04:55:43 localhost openstack_network_exporter[245358]: Feb 23 04:55:43 localhost openstack_network_exporter[245358]: ERROR 09:55:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:55:43 localhost openstack_network_exporter[245358]: Feb 23 04:55:43 localhost nova_compute[282206]: 2026-02-23 09:55:43.414 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e115 do_prune osdmap full prune enabled Feb 23 04:55:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e116 e116: 6 total, 6 up, 6 in Feb 23 04:55:44 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in Feb 23 04:55:44 localhost podman[311071]: Feb 23 04:55:44 localhost podman[311071]: 2026-02-23 09:55:44.225702021 +0000 UTC m=+0.096765561 container create 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 23 04:55:44 localhost systemd[1]: Started libpod-conmon-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99.scope. Feb 23 04:55:44 localhost podman[311071]: 2026-02-23 09:55:44.178261205 +0000 UTC m=+0.049324785 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:55:44 localhost systemd[1]: Started libcrun container. 
Feb 23 04:55:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76ea2f986b386bcf7db108926dc2e34cd1d13749b6df6aaab1304793a77d4243/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:55:44 localhost podman[311071]: 2026-02-23 09:55:44.323250305 +0000 UTC m=+0.194313845 container init 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:44 localhost podman[311071]: 2026-02-23 09:55:44.331993515 +0000 UTC m=+0.203057015 container start 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:44 localhost dnsmasq[311089]: started, version 2.85 cachesize 150 Feb 23 04:55:44 localhost dnsmasq[311089]: DNS service limited to local subnets Feb 23 04:55:44 localhost dnsmasq[311089]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:55:44 localhost dnsmasq[311089]: warning: no upstream servers 
configured Feb 23 04:55:44 localhost dnsmasq-dhcp[311089]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:55:44 localhost dnsmasq[311089]: read /var/lib/neutron/dhcp/03fa0e8f-af23-4fd5-aa8c-5de2330e1869/addn_hosts - 0 addresses Feb 23 04:55:44 localhost dnsmasq-dhcp[311089]: read /var/lib/neutron/dhcp/03fa0e8f-af23-4fd5-aa8c-5de2330e1869/host Feb 23 04:55:44 localhost dnsmasq-dhcp[311089]: read /var/lib/neutron/dhcp/03fa0e8f-af23-4fd5-aa8c-5de2330e1869/opts Feb 23 04:55:44 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:44.356 2 INFO neutron.agent.securitygroups_rpc [req-66b36902-2e90-44c7-97eb-062607295697 req-4da6358e-447d-4c62-9578-e6f0834d0e3a b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['6e64f6d6-976d-4cdf-bc43-87f175a49821']#033[00m Feb 23 04:55:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:44.503 265541 INFO neutron.agent.dhcp.agent [None req-0228ef53-fe82-400d-a28a-61496c4c13ac - - - - - -] DHCP configuration for ports {'badfeffc-25c8-4fd8-98d9-84c11f857262'} is completed#033[00m Feb 23 04:55:44 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:44.710 2 INFO neutron.agent.securitygroups_rpc [req-06a39d77-6c9e-4d1d-991d-0c622d1b8570 req-e582e36b-a1ce-42ef-aae5-7f3dd95ea0e7 b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['6e64f6d6-976d-4cdf-bc43-87f175a49821']#033[00m Feb 23 04:55:44 localhost dnsmasq[311089]: exiting on receipt of SIGTERM Feb 23 04:55:44 localhost podman[311108]: 2026-02-23 09:55:44.718749014 +0000 UTC m=+0.060211352 container kill 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:55:44 localhost systemd[1]: libpod-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99.scope: Deactivated successfully. Feb 23 04:55:44 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:44.760 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 101d2419-fd78-4ca6-a2ca-7ccf1bd2d588 with type ""#033[00m Feb 23 04:55:44 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:44.762 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-03fa0e8f-af23-4fd5-aa8c-5de2330e1869', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a5faee4-697a-4afe-96ef-26362544bf3c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2aa5a3c8-b285-43de-863b-75a8af32f886) old= matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:44 localhost ovn_controller[157695]: 2026-02-23T09:55:44Z|00141|binding|INFO|Removing iface tap2aa5a3c8-b2 ovn-installed in OVS Feb 23 04:55:44 localhost ovn_controller[157695]: 2026-02-23T09:55:44Z|00142|binding|INFO|Removing lport 2aa5a3c8-b285-43de-863b-75a8af32f886 ovn-installed in OVS Feb 23 04:55:44 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:44.763 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 2aa5a3c8-b285-43de-863b-75a8af32f886 in datapath 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 unbound from our chassis#033[00m Feb 23 04:55:44 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:44.764 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 03fa0e8f-af23-4fd5-aa8c-5de2330e1869 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:44 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:44.764 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[89cdfb2a-c016-4139-a26c-bcba61b9b990]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:44 localhost nova_compute[282206]: 2026-02-23 09:55:44.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:44 localhost nova_compute[282206]: 2026-02-23 09:55:44.771 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:44 localhost podman[311122]: 2026-02-23 09:55:44.796512316 +0000 UTC m=+0.062147871 container died 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:44 localhost podman[311122]: 2026-02-23 09:55:44.882215524 +0000 UTC m=+0.147851059 container cleanup 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:44 localhost systemd[1]: libpod-conmon-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99.scope: Deactivated successfully. 
Feb 23 04:55:44 localhost podman[311124]: 2026-02-23 09:55:44.911211259 +0000 UTC m=+0.171006084 container remove 4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-03fa0e8f-af23-4fd5-aa8c-5de2330e1869, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0) Feb 23 04:55:44 localhost nova_compute[282206]: 2026-02-23 09:55:44.923 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:44 localhost kernel: device tap2aa5a3c8-b2 left promiscuous mode Feb 23 04:55:44 localhost nova_compute[282206]: 2026-02-23 09:55:44.940 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:44 localhost ovn_controller[157695]: 2026-02-23T09:55:44Z|00143|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:44.960 265541 INFO neutron.agent.dhcp.agent [None req-bebac8b9-8225-4d6f-82da-0e727140ba2e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:44.961 265541 INFO neutron.agent.dhcp.agent [None req-bebac8b9-8225-4d6f-82da-0e727140ba2e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e116 do_prune osdmap full prune enabled Feb 23 04:55:45 localhost nova_compute[282206]: 
2026-02-23 09:55:45.008 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e117 e117: 6 total, 6 up, 6 in Feb 23 04:55:45 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in Feb 23 04:55:45 localhost systemd[1]: var-lib-containers-storage-overlay-76ea2f986b386bcf7db108926dc2e34cd1d13749b6df6aaab1304793a77d4243-merged.mount: Deactivated successfully. Feb 23 04:55:45 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f44c2730494b5d7469d7d535852233bf0bae7a5ec4d3488127612de54537d99-userdata-shm.mount: Deactivated successfully. Feb 23 04:55:45 localhost systemd[1]: run-netns-qdhcp\x2d03fa0e8f\x2daf23\x2d4fd5\x2daa8c\x2d5de2330e1869.mount: Deactivated successfully. Feb 23 04:55:45 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:45.352 2 INFO neutron.agent.securitygroups_rpc [req-b47c5ca0-09bc-4b25-97b3-180f5e0f18ac req-fd279825-b764-4bf6-b8d3-f7613a7508ce b2cb9c14c50346658af8c86574d3a360 91db788359a945be921785f05bf8c883 - - default default] Security group rule updated ['6e64f6d6-976d-4cdf-bc43-87f175a49821']#033[00m Feb 23 04:55:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e117 do_prune osdmap full prune enabled Feb 23 04:55:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e118 e118: 6 total, 6 up, 6 in Feb 23 04:55:46 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in Feb 23 04:55:46 localhost ovn_controller[157695]: 2026-02-23T09:55:46Z|00144|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:46 localhost nova_compute[282206]: 2026-02-23 09:55:46.440 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:46 
localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e118 do_prune osdmap full prune enabled Feb 23 04:55:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e119 e119: 6 total, 6 up, 6 in Feb 23 04:55:46 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in Feb 23 04:55:47 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:47.027 2 INFO neutron.agent.securitygroups_rpc [None req-f20a6c5c-ae1e-41e5-8a0b-e142fc8dd656 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:47 localhost nova_compute[282206]: 2026-02-23 09:55:47.828 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:47 localhost nova_compute[282206]: 2026-02-23 09:55:47.830 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e119 do_prune osdmap full prune enabled Feb 23 04:55:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e120 e120: 6 total, 6 up, 6 in Feb 23 04:55:48 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in Feb 23 04:55:48 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:48.503 2 INFO neutron.agent.securitygroups_rpc [None req-1ad0cbd3-986c-404f-b323-25b4bb76d296 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:48.556 
163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:55:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:55:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:48.556 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:55:48 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:48.673 2 INFO neutron.agent.securitygroups_rpc [None req-1ad0cbd3-986c-404f-b323-25b4bb76d296 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e120 do_prune osdmap full prune enabled Feb 23 04:55:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e121 e121: 6 total, 6 up, 6 in Feb 23 04:55:49 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in Feb 23 04:55:49 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:49.287 2 INFO neutron.agent.securitygroups_rpc [None req-2c61be2a-9b0c-4b49-b66c-1a782048b54f 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:49 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:49.313 
265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e121 do_prune osdmap full prune enabled Feb 23 04:55:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e122 e122: 6 total, 6 up, 6 in Feb 23 04:55:50 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in Feb 23 04:55:50 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:50.433 2 INFO neutron.agent.securitygroups_rpc [None req-0bde3ec4-1f43-4d89-82d0-564202a4897c 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:51.117 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e122 do_prune osdmap full prune enabled Feb 23 04:55:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e123 e123: 6 total, 6 up, 6 in Feb 23 04:55:51 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in Feb 23 04:55:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:52 localhost ovn_controller[157695]: 2026-02-23T09:55:52Z|00145|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:52 localhost nova_compute[282206]: 2026-02-23 09:55:52.815 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:55:52 localhost nova_compute[282206]: 2026-02-23 09:55:52.829 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:52 localhost nova_compute[282206]: 2026-02-23 09:55:52.834 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:52 localhost systemd[1]: tmp-crun.qFacVN.mount: Deactivated successfully. Feb 23 04:55:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e123 do_prune osdmap full prune enabled Feb 23 04:55:52 localhost podman[311152]: 2026-02-23 09:55:52.924584758 +0000 UTC m=+0.092913811 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:55:52 localhost podman[311152]: 2026-02-23 09:55:52.961364964 +0000 UTC m=+0.129694067 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:55:52 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:55:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e124 e124: 6 total, 6 up, 6 in Feb 23 04:55:53 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in Feb 23 04:55:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:53.975 265541 INFO neutron.agent.linux.ip_lib [None req-81e8d427-46c9-456f-9ee1-b5d20f69f376 - - - - - -] Device tape39b42ef-29 cannot be used as it has no MAC address#033[00m Feb 23 04:55:53 localhost nova_compute[282206]: 2026-02-23 09:55:53.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:54 localhost kernel: device tape39b42ef-29 entered promiscuous mode Feb 23 04:55:54 localhost ovn_controller[157695]: 2026-02-23T09:55:54Z|00146|binding|INFO|Claiming lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f for this chassis. 
Feb 23 04:55:54 localhost ovn_controller[157695]: 2026-02-23T09:55:54Z|00147|binding|INFO|e39b42ef-2915-4d7d-bb0f-f93a8d18df3f: Claiming unknown Feb 23 04:55:54 localhost nova_compute[282206]: 2026-02-23 09:55:54.004 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:54 localhost NetworkManager[5974]: [1771840554.0082] manager: (tape39b42ef-29): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Feb 23 04:55:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e124 do_prune osdmap full prune enabled Feb 23 04:55:54 localhost systemd-udevd[311186]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:55:54 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:54.033 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ad6b648-6ffe-4ae2-bf96-781afc55b826, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=e39b42ef-2915-4d7d-bb0f-f93a8d18df3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:54 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:54.034 163572 INFO neutron.agent.ovn.metadata.agent [-] Port e39b42ef-2915-4d7d-bb0f-f93a8d18df3f in datapath 462b226e-df7b-4026-91be-fef5d89fea0c bound to our chassis#033[00m Feb 23 04:55:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e125 e125: 6 total, 6 up, 6 in Feb 23 04:55:54 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:54.037 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 462b226e-df7b-4026-91be-fef5d89fea0c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:55:54 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:54.037 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[dee78a72-12b7-45c2-a6d3-38a83ce6dc83]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:54 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in Feb 23 04:55:54 localhost ovn_controller[157695]: 2026-02-23T09:55:54Z|00148|binding|INFO|Setting lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f ovn-installed in OVS Feb 23 04:55:54 localhost ovn_controller[157695]: 2026-02-23T09:55:54Z|00149|binding|INFO|Setting lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f up in Southbound Feb 23 04:55:54 localhost nova_compute[282206]: 2026-02-23 09:55:54.080 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:54 localhost nova_compute[282206]: 2026-02-23 09:55:54.090 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:54 localhost nova_compute[282206]: 2026-02-23 09:55:54.121 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:55 localhost podman[311241]: Feb 23 04:55:55 localhost podman[311241]: 2026-02-23 09:55:55.024278059 +0000 UTC m=+0.093846100 container create 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:55:55 localhost systemd[1]: Started libpod-conmon-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7.scope. Feb 23 04:55:55 localhost podman[311241]: 2026-02-23 09:55:54.978061851 +0000 UTC m=+0.047629912 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:55:55 localhost systemd[1]: Started libcrun container. 
Feb 23 04:55:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/244af4fa1319ccb844fecc1fd3464421321a9916252948b6f4a6c8b69c4eec9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:55:55 localhost podman[311241]: 2026-02-23 09:55:55.103822587 +0000 UTC m=+0.173390618 container init 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:55 localhost podman[311241]: 2026-02-23 09:55:55.114730424 +0000 UTC m=+0.184298455 container start 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:55:55 localhost dnsmasq[311270]: started, version 2.85 cachesize 150 Feb 23 04:55:55 localhost dnsmasq[311270]: DNS service limited to local subnets Feb 23 04:55:55 localhost dnsmasq[311270]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:55:55 localhost dnsmasq[311270]: warning: no upstream servers 
configured Feb 23 04:55:55 localhost dnsmasq-dhcp[311270]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:55:55 localhost dnsmasq[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/addn_hosts - 0 addresses Feb 23 04:55:55 localhost dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/host Feb 23 04:55:55 localhost dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/opts Feb 23 04:55:55 localhost podman[311255]: 2026-02-23 09:55:55.159154487 +0000 UTC m=+0.088415013 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 04:55:55 localhost podman[311255]: 2026-02-23 09:55:55.200024529 +0000 UTC m=+0.129285075 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public) Feb 23 04:55:55 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:55:55 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:55.318 265541 INFO neutron.agent.dhcp.agent [None req-f1e60851-382a-47cf-8cc4-b5b23388d7f1 - - - - - -] DHCP configuration for ports {'3f0b7505-ef5e-4e16-9dc2-7cc00255b2e3'} is completed#033[00m Feb 23 04:55:56 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:56.398 2 INFO neutron.agent.securitygroups_rpc [None req-680f6195-a81a-4ace-93bb-ca63b4542035 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:56 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:56.436 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:55:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ee23af99-5af9-4088-9933-048b02e82885, ip_allocation=immediate, mac_address=fa:16:3e:c7:c9:6f, name=tempest-PortsTestJSON-272504090, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:55:51Z, description=, dns_domain=, id=462b226e-df7b-4026-91be-fef5d89fea0c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-445099310, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37265, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1186, status=ACTIVE, subnets=['c82bacf0-e9f5-4b47-ab92-0189f37d0778'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:55:53Z, vlan_transparent=None, network_id=462b226e-df7b-4026-91be-fef5d89fea0c, 
port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['709ad995-bfde-4096-a0b4-2ba30248a611'], standard_attr_id=1222, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:55:56Z on network 462b226e-df7b-4026-91be-fef5d89fea0c#033[00m Feb 23 04:55:56 localhost podman[311295]: 2026-02-23 09:55:56.791501188 +0000 UTC m=+0.062850662 container kill 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0) Feb 23 04:55:56 localhost dnsmasq[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/addn_hosts - 1 addresses Feb 23 04:55:56 localhost dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/host Feb 23 04:55:56 localhost dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/opts Feb 23 04:55:56 localhost systemd[1]: tmp-crun.j6tApH.mount: Deactivated successfully. 
Feb 23 04:55:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e125 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:55:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e125 do_prune osdmap full prune enabled Feb 23 04:55:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e126 e126: 6 total, 6 up, 6 in Feb 23 04:55:56 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in Feb 23 04:55:57 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:57.087 265541 INFO neutron.agent.dhcp.agent [None req-39c42241-d377-4afd-9487-6b705568ac4c - - - - - -] DHCP configuration for ports {'ee23af99-5af9-4088-9933-048b02e82885'} is completed#033[00m Feb 23 04:55:57 localhost neutron_sriov_agent[258207]: 2026-02-23 09:55:57.365 2 INFO neutron.agent.securitygroups_rpc [None req-e495f557-19e6-4684-b099-a0de07319228 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:55:57 localhost dnsmasq[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/addn_hosts - 0 addresses Feb 23 04:55:57 localhost podman[311332]: 2026-02-23 09:55:57.592071502 +0000 UTC m=+0.042495853 container kill 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:55:57 localhost dnsmasq-dhcp[311270]: read 
/var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/host Feb 23 04:55:57 localhost dnsmasq-dhcp[311270]: read /var/lib/neutron/dhcp/462b226e-df7b-4026-91be-fef5d89fea0c/opts Feb 23 04:55:57 localhost nova_compute[282206]: 2026-02-23 09:55:57.832 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:57 localhost nova_compute[282206]: 2026-02-23 09:55:57.837 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:58 localhost systemd[1]: tmp-crun.JAlg08.mount: Deactivated successfully. Feb 23 04:55:58 localhost dnsmasq[311270]: exiting on receipt of SIGTERM Feb 23 04:55:58 localhost podman[311370]: 2026-02-23 09:55:58.222238182 +0000 UTC m=+0.081252602 container kill 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:55:58 localhost systemd[1]: libpod-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7.scope: Deactivated successfully. 
Feb 23 04:55:58 localhost podman[311384]: 2026-02-23 09:55:58.290904054 +0000 UTC m=+0.056793287 container died 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:55:58 localhost podman[311384]: 2026-02-23 09:55:58.326853974 +0000 UTC m=+0.092743167 container cleanup 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2) Feb 23 04:55:58 localhost systemd[1]: libpod-conmon-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7.scope: Deactivated successfully. 
Feb 23 04:55:58 localhost podman[311386]: 2026-02-23 09:55:58.370270745 +0000 UTC m=+0.128675146 container remove 9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-462b226e-df7b-4026-91be-fef5d89fea0c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 23 04:55:58 localhost kernel: device tape39b42ef-29 left promiscuous mode Feb 23 04:55:58 localhost nova_compute[282206]: 2026-02-23 09:55:58.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:58 localhost ovn_controller[157695]: 2026-02-23T09:55:58Z|00150|binding|INFO|Releasing lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f from this chassis (sb_readonly=0) Feb 23 04:55:58 localhost ovn_controller[157695]: 2026-02-23T09:55:58Z|00151|binding|INFO|Setting lport e39b42ef-2915-4d7d-bb0f-f93a8d18df3f down in Southbound Feb 23 04:55:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:58.433 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-462b226e-df7b-4026-91be-fef5d89fea0c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ad6b648-6ffe-4ae2-bf96-781afc55b826, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e39b42ef-2915-4d7d-bb0f-f93a8d18df3f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:55:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:58.436 163572 INFO neutron.agent.ovn.metadata.agent [-] Port e39b42ef-2915-4d7d-bb0f-f93a8d18df3f in datapath 462b226e-df7b-4026-91be-fef5d89fea0c unbound from our chassis#033[00m Feb 23 04:55:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:58.440 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 462b226e-df7b-4026-91be-fef5d89fea0c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:55:58 localhost nova_compute[282206]: 2026-02-23 09:55:58.440 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:55:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:55:58.441 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[db23f5ee-c122-4895-8b60-38c829ce0e18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:55:58 localhost systemd[1]: 
var-lib-containers-storage-overlay-244af4fa1319ccb844fecc1fd3464421321a9916252948b6f4a6c8b69c4eec9a-merged.mount: Deactivated successfully. Feb 23 04:55:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9a5fe7243542d4da164ca84f8618b5d7467b4823396a362458b2528c43320bb7-userdata-shm.mount: Deactivated successfully. Feb 23 04:55:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:59.008 265541 INFO neutron.agent.dhcp.agent [None req-0c8823e9-1756-435e-8972-19e10f09bef2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:59.009 265541 INFO neutron.agent.dhcp.agent [None req-0c8823e9-1756-435e-8972-19e10f09bef2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:59 localhost systemd[1]: run-netns-qdhcp\x2d462b226e\x2ddf7b\x2d4026\x2d91be\x2dfef5d89fea0c.mount: Deactivated successfully. Feb 23 04:55:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:55:59.311 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:55:59 localhost ovn_controller[157695]: 2026-02-23T09:55:59Z|00152|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:55:59 localhost nova_compute[282206]: 2026-02-23 09:55:59.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e126 do_prune osdmap full prune enabled Feb 23 04:56:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 e127: 6 total, 6 up, 6 in Feb 23 04:56:01 localhost ceph-mon[294160]: 
log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in Feb 23 04:56:02 localhost nova_compute[282206]: 2026-02-23 09:56:02.834 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:02 localhost nova_compute[282206]: 2026-02-23 09:56:02.838 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:04.001 265541 INFO neutron.agent.linux.ip_lib [None req-adac8943-853d-42f8-b694-c25be0846689 - - - - - -] Device tap635b363d-ef cannot be used as it has no MAC address#033[00m Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.023 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost kernel: device tap635b363d-ef entered promiscuous mode Feb 23 04:56:04 localhost NetworkManager[5974]: [1771840564.0316] manager: (tap635b363d-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.032 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost systemd-udevd[311423]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:56:04 localhost ovn_controller[157695]: 2026-02-23T09:56:04Z|00153|binding|INFO|Claiming lport 635b363d-ef8c-4e25-843f-da965f86fee0 for this chassis. 
Feb 23 04:56:04 localhost ovn_controller[157695]: 2026-02-23T09:56:04Z|00154|binding|INFO|635b363d-ef8c-4e25-843f-da965f86fee0: Claiming unknown Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.066 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost ovn_controller[157695]: 2026-02-23T09:56:04Z|00155|binding|INFO|Setting lport 635b363d-ef8c-4e25-843f-da965f86fee0 ovn-installed in OVS Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.069 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.070 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost journal[231253]: ethtool ioctl error on tap635b363d-ef: No such device Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.105 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.130 
282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:04 localhost ovn_controller[157695]: 2026-02-23T09:56:04Z|00156|binding|INFO|Setting lport 635b363d-ef8c-4e25-843f-da965f86fee0 up in Southbound Feb 23 04:56:04 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:04.207 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6302580f-701e-45c5-96d0-5d526435f898, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=635b363d-ef8c-4e25-843f-da965f86fee0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:04 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:04.209 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 635b363d-ef8c-4e25-843f-da965f86fee0 in datapath a74642b2-dd5d-4d6b-b98a-2a45bd6773c0 bound to our chassis#033[00m Feb 23 04:56:04 
localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:04.211 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network a74642b2-dd5d-4d6b-b98a-2a45bd6773c0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:04 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:04.212 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c7dcbd8d-edd9-4485-8c00-5508df5fd057]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:04 localhost nova_compute[282206]: 2026-02-23 09:56:04.254 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:05 localhost podman[311494]: Feb 23 04:56:05 localhost podman[311494]: 2026-02-23 09:56:05.259807161 +0000 UTC m=+0.091125346 container create 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:56:05 localhost systemd[1]: Started libpod-conmon-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785.scope. Feb 23 04:56:05 localhost podman[311494]: 2026-02-23 09:56:05.215299166 +0000 UTC m=+0.046617361 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:56:05 localhost systemd[1]: Started libcrun container. 
Feb 23 04:56:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7238c05bef8095068d10408be5e9c281faac1078934f1d2694534ecaf86cdf18/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:56:05 localhost podman[311494]: 2026-02-23 09:56:05.332526767 +0000 UTC m=+0.163844932 container init 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:05 localhost podman[311494]: 2026-02-23 09:56:05.342547708 +0000 UTC m=+0.173865873 container start 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:56:05 localhost dnsmasq[311512]: started, version 2.85 cachesize 150 Feb 23 04:56:05 localhost dnsmasq[311512]: DNS service limited to local subnets Feb 23 04:56:05 localhost dnsmasq[311512]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:56:05 localhost dnsmasq[311512]: warning: no upstream servers 
configured Feb 23 04:56:05 localhost dnsmasq-dhcp[311512]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:56:05 localhost dnsmasq[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/addn_hosts - 0 addresses Feb 23 04:56:05 localhost dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/host Feb 23 04:56:05 localhost dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/opts Feb 23 04:56:05 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:05.653 265541 INFO neutron.agent.dhcp.agent [None req-6782edd9-f124-453d-a0ec-26d036c007c0 - - - - - -] DHCP configuration for ports {'4f6e98e8-9bd4-4004-9cb5-c86e3901ec62'} is completed#033[00m Feb 23 04:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:56:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:56:06 localhost podman[311514]: 2026-02-23 09:56:06.152102248 +0000 UTC m=+0.072455559 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:56:06 localhost podman[311514]: 2026-02-23 09:56:06.161534949 +0000 UTC m=+0.081888320 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:56:06 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:56:06 localhost podman[311513]: 2026-02-23 09:56:06.207179769 +0000 UTC m=+0.130488922 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:56:06 localhost podman[311513]: 2026-02-23 09:56:06.279348069 +0000 UTC m=+0.202657272 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0) Feb 23 04:56:06 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:56:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:07 localhost nova_compute[282206]: 2026-02-23 09:56:07.208 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:07 localhost nova_compute[282206]: 2026-02-23 09:56:07.835 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:07 localhost nova_compute[282206]: 2026-02-23 09:56:07.841 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:08 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:08.066 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:07Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b540ec8b-5a2e-4ae4-8a26-f1d68cdc922c, ip_allocation=immediate, mac_address=fa:16:3e:eb:dc:34, name=tempest-PortsTestJSON-835694374, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:00Z, description=, dns_domain=, id=a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-582323247, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44895, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1258, status=ACTIVE, 
subnets=['0067f710-f96d-4ecf-888a-9e2b98e326fd'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:02Z, vlan_transparent=None, network_id=a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1297, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:07Z on network a74642b2-dd5d-4d6b-b98a-2a45bd6773c0#033[00m Feb 23 04:56:08 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:08.216 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:08 localhost dnsmasq[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/addn_hosts - 1 addresses Feb 23 04:56:08 localhost dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/host Feb 23 04:56:08 localhost podman[311579]: 2026-02-23 09:56:08.355483322 +0000 UTC m=+0.060243681 container kill 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:56:08 localhost dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/opts Feb 23 04:56:08 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:08.607 265541 INFO neutron.agent.dhcp.agent [None req-af2cc72e-afdd-4e3b-a3c2-9fdfc5a27c76 - - - - - -] DHCP configuration for ports 
{'b540ec8b-5a2e-4ae4-8a26-f1d68cdc922c'} is completed#033[00m Feb 23 04:56:09 localhost podman[242954]: time="2026-02-23T09:56:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:56:09 localhost podman[242954]: @ - - [23/Feb/2026:09:56:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162553 "" "Go-http-client/1.1" Feb 23 04:56:09 localhost podman[242954]: @ - - [23/Feb/2026:09:56:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20244 "" "Go-http-client/1.1" Feb 23 04:56:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:09.556 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:09.557 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:56:09 localhost nova_compute[282206]: 2026-02-23 09:56:09.598 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:09 localhost dnsmasq[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/addn_hosts - 0 addresses Feb 23 04:56:09 localhost dnsmasq-dhcp[311512]: read 
/var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/host Feb 23 04:56:09 localhost dnsmasq-dhcp[311512]: read /var/lib/neutron/dhcp/a74642b2-dd5d-4d6b-b98a-2a45bd6773c0/opts Feb 23 04:56:09 localhost podman[311616]: 2026-02-23 09:56:09.866737114 +0000 UTC m=+0.063560665 container kill 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:56:09 localhost systemd[1]: tmp-crun.Nq1gu7.mount: Deactivated successfully. Feb 23 04:56:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:56:10 localhost podman[311637]: 2026-02-23 09:56:10.905229678 +0000 UTC m=+0.081391656 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.43.0, tcib_managed=true) Feb 23 04:56:10 localhost podman[311637]: 2026-02-23 09:56:10.9156334 +0000 UTC m=+0.091795378 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:56:10 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:56:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:56:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 3171 writes, 26K keys, 3171 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s#012Cumulative WAL: 3171 writes, 3171 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3171 writes, 26K keys, 3171 commit groups, 1.0 writes per commit group, ingest: 48.04 MB, 0.08 MB/s#012Interval WAL: 3171 writes, 3171 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 151.6 0.23 0.09 12 0.019 0 0 0.0 0.0#012 L6 1/0 17.05 MB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 5.2 155.7 141.7 1.28 0.48 11 0.116 129K 5605 0.0 0.0#012 Sum 1/0 17.05 MB 0.0 0.2 0.0 0.2 0.2 0.1 0.0 6.2 131.9 143.2 1.51 0.57 23 0.066 129K 5605 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.1 0.0 6.2 132.2 143.5 1.51 0.57 22 0.068 129K 5605 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) 
Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 0.0 155.7 141.7 1.28 0.48 11 0.116 129K 5605 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 153.8 0.23 0.09 11 0.021 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.034, interval 0.034#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.5 seconds#012Interval compaction: 0.21 GB write, 0.36 MB/s write, 0.19 GB read, 0.33 MB/s read, 1.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5609fbab9350#2 capacity: 308.00 MB usage: 33.10 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000317 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2166,32.23 MB,10.4648%) FilterBlock(23,375.17 KB,0.118954%) IndexBlock(23,512.08 KB,0.162362%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 23 04:56:11 localhost dnsmasq[311512]: exiting on receipt of SIGTERM Feb 23 04:56:11 localhost systemd[1]: libpod-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785.scope: 
Deactivated successfully. Feb 23 04:56:11 localhost podman[311671]: 2026-02-23 09:56:11.591007316 +0000 UTC m=+0.056424705 container kill 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216) Feb 23 04:56:11 localhost podman[311683]: 2026-02-23 09:56:11.668804359 +0000 UTC m=+0.061472560 container died 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:11 localhost podman[311683]: 2026-02-23 09:56:11.697398843 +0000 UTC m=+0.090067004 container cleanup 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image) Feb 23 04:56:11 localhost systemd[1]: libpod-conmon-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785.scope: Deactivated successfully. Feb 23 04:56:11 localhost podman[311685]: 2026-02-23 09:56:11.742470535 +0000 UTC m=+0.129989877 container remove 130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:56:11 localhost nova_compute[282206]: 2026-02-23 09:56:11.791 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:11 localhost kernel: device tap635b363d-ef left promiscuous mode Feb 23 04:56:11 localhost ovn_controller[157695]: 2026-02-23T09:56:11Z|00157|binding|INFO|Releasing lport 635b363d-ef8c-4e25-843f-da965f86fee0 from this chassis (sb_readonly=0) Feb 23 04:56:11 localhost ovn_controller[157695]: 2026-02-23T09:56:11Z|00158|binding|INFO|Setting lport 635b363d-ef8c-4e25-843f-da965f86fee0 down in Southbound Feb 23 04:56:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:11.803 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 
'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a74642b2-dd5d-4d6b-b98a-2a45bd6773c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6302580f-701e-45c5-96d0-5d526435f898, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=635b363d-ef8c-4e25-843f-da965f86fee0) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:11.804 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 635b363d-ef8c-4e25-843f-da965f86fee0 in datapath a74642b2-dd5d-4d6b-b98a-2a45bd6773c0 unbound from our chassis#033[00m Feb 23 04:56:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:11.808 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a74642b2-dd5d-4d6b-b98a-2a45bd6773c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:56:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:11.809 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e95a620b-b8dc-49bb-9e35-ae4ffc346a86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:11 localhost nova_compute[282206]: 2026-02-23 09:56:11.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:11 localhost systemd[1]: var-lib-containers-storage-overlay-7238c05bef8095068d10408be5e9c281faac1078934f1d2694534ecaf86cdf18-merged.mount: Deactivated successfully. Feb 23 04:56:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-130a4bfce8aa5bdc5aa0546a6a9f179af2123d546ecc2cb5ba40343f56380785-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:12.075 265541 INFO neutron.agent.dhcp.agent [None req-76bdfc34-db0d-4a92-8381-da44c2c45924 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:12.075 265541 INFO neutron.agent.dhcp.agent [None req-76bdfc34-db0d-4a92-8381-da44c2c45924 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:12 localhost systemd[1]: run-netns-qdhcp\x2da74642b2\x2ddd5d\x2d4d6b\x2db98a\x2d2a45bd6773c0.mount: Deactivated successfully. 
Feb 23 04:56:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:12.446 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:12 localhost nova_compute[282206]: 2026-02-23 09:56:12.766 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:12 localhost ovn_controller[157695]: 2026-02-23T09:56:12Z|00159|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:12 localhost nova_compute[282206]: 2026-02-23 09:56:12.813 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:12 localhost nova_compute[282206]: 2026-02-23 09:56:12.874 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:13 localhost openstack_network_exporter[245358]: ERROR 09:56:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:56:13 localhost openstack_network_exporter[245358]: Feb 23 04:56:13 localhost openstack_network_exporter[245358]: ERROR 09:56:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:56:13 localhost openstack_network_exporter[245358]: Feb 23 04:56:13 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:13.401 2 INFO neutron.agent.securitygroups_rpc [None req-d12b9e97-0a30-4af0-bab2-d9a3d950dae1 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:56:13 localhost systemd[1]: tmp-crun.gkeYSB.mount: Deactivated successfully. Feb 23 04:56:13 localhost podman[311714]: 2026-02-23 09:56:13.91641726 +0000 UTC m=+0.091573129 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 23 04:56:13 localhost podman[311714]: 2026-02-23 09:56:13.925278754 +0000 UTC m=+0.100434613 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 04:56:13 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:56:14 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:14.159 2 INFO neutron.agent.securitygroups_rpc [None req-2b12f6d0-e3fd-4fa1-a330-93a66177eb38 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:14 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:14.691 2 INFO neutron.agent.securitygroups_rpc [None req-313a7d3e-1b0f-4380-96f5-be20bc42956f 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:15.559 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:56:15 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:15.661 2 INFO neutron.agent.securitygroups_rpc [None req-ab6e9fb7-3784-4829-9f74-5b432c230863 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:16 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:16.418 2 INFO neutron.agent.securitygroups_rpc [None req-f9178f29-327a-4b87-b505-9a750a3f52d0 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:16 
localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:17 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:17.377 2 INFO neutron.agent.securitygroups_rpc [None req-9798c79a-b835-452b-b3e7-ba6f51410008 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:17 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:17.482 2 INFO neutron.agent.securitygroups_rpc [None req-266d671f-bdaf-4cc0-a88f-fea21e1850b2 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:17 localhost nova_compute[282206]: 2026-02-23 09:56:17.747 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:17 localhost nova_compute[282206]: 2026-02-23 09:56:17.879 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:17 localhost nova_compute[282206]: 2026-02-23 09:56:17.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:18 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:18.323 2 INFO neutron.agent.securitygroups_rpc [None req-aa9784b4-8658-4ac5-a544-4839237cb0a4 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']#033[00m Feb 23 04:56:18 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:18.733 2 INFO neutron.agent.securitygroups_rpc [None req-40589676-1ef1-47e5-81ec-92f9fb0c6844 d6a332b0f2e446a68dc6c19280644090 
2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:56:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:56:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:56:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:56:19 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:19.212 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:56:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of 
instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.188 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.188 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.189 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.189 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:56:21 localhost ovn_controller[157695]: 2026-02-23T09:56:21Z|00160|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.470 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:56:21 localhost neutron_dhcp_agent[265537]: 2026-02-23 
09:56:21.787 265541 INFO neutron.agent.linux.ip_lib [None req-821e9013-c504-4be4-94d4-01337e729fb3 - - - - - -] Device tap372a9673-6c cannot be used as it has no MAC address#033[00m Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.842 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:21 localhost kernel: device tap372a9673-6c entered promiscuous mode Feb 23 04:56:21 localhost NetworkManager[5974]: [1771840581.8519] manager: (tap372a9673-6c): new Generic device (/org/freedesktop/NetworkManager/Devices/29) Feb 23 04:56:21 localhost systemd-udevd[311828]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.855 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:21 localhost ovn_controller[157695]: 2026-02-23T09:56:21Z|00161|binding|INFO|Claiming lport 372a9673-6c0e-49dd-9d35-ac2275c153ff for this chassis. 
Feb 23 04:56:21 localhost ovn_controller[157695]: 2026-02-23T09:56:21Z|00162|binding|INFO|372a9673-6c0e-49dd-9d35-ac2275c153ff: Claiming unknown Feb 23 04:56:21 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:21.866 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3af6da47-311a-4978-bae3-0b17a56bce02, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=372a9673-6c0e-49dd-9d35-ac2275c153ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:21 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:21.867 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 372a9673-6c0e-49dd-9d35-ac2275c153ff in datapath 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 bound to our chassis#033[00m Feb 23 04:56:21 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:21.869 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:21 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:21.870 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce18b77-18f8-4ea5-bb39-80b2a593e706]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost ovn_controller[157695]: 2026-02-23T09:56:21Z|00163|binding|INFO|Setting lport 372a9673-6c0e-49dd-9d35-ac2275c153ff ovn-installed in OVS Feb 23 04:56:21 localhost ovn_controller[157695]: 2026-02-23T09:56:21Z|00164|binding|INFO|Setting lport 372a9673-6c0e-49dd-9d35-ac2275c153ff up in Southbound Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.884 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost journal[231253]: ethtool ioctl error on tap372a9673-6c: No such device Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.926 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Feb 23 04:56:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:21 localhost nova_compute[282206]: 2026-02-23 09:56:21.954 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:22.184 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port de3bbf63-31eb-40e8-b51e-d7191f3813e3 with type ""#033[00m Feb 23 04:56:22 localhost ovn_controller[157695]: 2026-02-23T09:56:22Z|00165|binding|INFO|Removing iface tap372a9673-6c ovn-installed in OVS Feb 23 04:56:22 localhost ovn_controller[157695]: 2026-02-23T09:56:22Z|00166|binding|INFO|Removing lport 372a9673-6c0e-49dd-9d35-ac2275c153ff ovn-installed in OVS Feb 23 04:56:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:22.186 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3af6da47-311a-4978-bae3-0b17a56bce02, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=372a9673-6c0e-49dd-9d35-ac2275c153ff) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.186 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:22.188 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 372a9673-6c0e-49dd-9d35-ac2275c153ff in datapath 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 unbound from our chassis#033[00m Feb 23 04:56:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:22.191 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:22.191 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e74a35cf-a7e0-4097-b28d-bfc67f0dbfaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.196 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:22 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:22.573 2 INFO neutron.agent.securitygroups_rpc [None req-0e752e7e-c507-42b6-b334-815c25dce29c 6d15d44765db469a9e04a32fb56dcff2 2ac6a6009ea84eb99f60bd242e459002 - - default default] Security group member updated ['917bfa8c-752a-4a55-9acc-5ce6144207b4']#033[00m Feb 23 
04:56:22 localhost podman[311899]: Feb 23 04:56:22 localhost podman[311899]: 2026-02-23 09:56:22.716011627 +0000 UTC m=+0.077397421 container create db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:56:22 localhost systemd[1]: Started libpod-conmon-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a.scope. Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.761 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": 
"a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:56:22 localhost systemd[1]: Started libcrun container. Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.776 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.777 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:56:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6fd10a276b25a13959ac50d6133201d49913bb2a1ba87afd357e7af4f3a8b1bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:56:22 localhost podman[311899]: 2026-02-23 09:56:22.682399039 +0000 UTC m=+0.043784773 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:56:22 localhost podman[311899]: 2026-02-23 09:56:22.790114867 +0000 UTC m=+0.151500611 container init db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:56:22 localhost podman[311899]: 2026-02-23 09:56:22.800311973 +0000 UTC m=+0.161697707 container start db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216) Feb 23 04:56:22 localhost dnsmasq[311917]: started, version 2.85 cachesize 150 Feb 23 04:56:22 localhost dnsmasq[311917]: DNS service limited to local subnets Feb 23 04:56:22 localhost dnsmasq[311917]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:56:22 localhost dnsmasq[311917]: warning: no upstream servers configured Feb 23 04:56:22 localhost dnsmasq-dhcp[311917]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:56:22 localhost dnsmasq[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/addn_hosts - 0 addresses Feb 23 04:56:22 localhost dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/host Feb 23 04:56:22 localhost dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/opts Feb 23 04:56:22 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:22.903 265541 INFO neutron.agent.dhcp.agent [None req-1ccbce1c-4cea-49df-9178-1b829f03f556 - - - - - -] DHCP configuration for ports {'9bb465c3-7b01-4dc3-9b6f-a528439f0d87'} is completed#033[00m Feb 23 04:56:22 
localhost nova_compute[282206]: 2026-02-23 09:56:22.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:22 localhost kernel: device tap372a9673-6c left promiscuous mode Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.917 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:56:22 localhost nova_compute[282206]: 2026-02-23 09:56:22.928 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:23 localhost dnsmasq[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/addn_hosts - 0 addresses Feb 23 04:56:23 localhost dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/host Feb 23 04:56:23 localhost dnsmasq-dhcp[311917]: read /var/lib/neutron/dhcp/05d93df9-29e9-48b0-9b9e-8c7a4eaa7448/opts Feb 23 04:56:23 localhost podman[311935]: 2026-02-23 09:56:23.110948349 +0000 UTC m=+0.061418359 container kill db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent [None req-821e9013-c504-4be4-94d4-01337e729fb3 - - - - - -] Unable to reload_allocations dhcp for 05d93df9-29e9-48b0-9b9e-8c7a4eaa7448.: 
neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap372a9673-6c not found in namespace qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448. Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent return fut.result() Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent return self.__get_result() Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent raise self._exception Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Feb 23 04:56:23 localhost 
neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap372a9673-6c not found in namespace qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448. Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.132 265541 ERROR neutron.agent.dhcp.agent #033[00m Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.139 265541 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Feb 23 04:56:23 localhost ovn_controller[157695]: 2026-02-23T09:56:23Z|00167|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:23 localhost nova_compute[282206]: 2026-02-23 09:56:23.310 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.400 265541 INFO neutron.agent.dhcp.agent [None req-dee707e2-4fcb-4925-9852-1b2d562dd6b0 - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 23 04:56:23 localhost dnsmasq[311917]: exiting on receipt of SIGTERM Feb 23 04:56:23 localhost podman[311965]: 2026-02-23 09:56:23.546888358 +0000 UTC m=+0.042882956 container kill db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:23 localhost systemd[1]: libpod-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a.scope: Deactivated successfully. Feb 23 04:56:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:56:23 localhost podman[311980]: 2026-02-23 09:56:23.626128006 +0000 UTC m=+0.061878663 container died db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:23 localhost podman[311980]: 2026-02-23 09:56:23.679403352 +0000 UTC m=+0.115153959 container remove db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-05d93df9-29e9-48b0-9b9e-8c7a4eaa7448, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:56:23 localhost podman[311991]: 2026-02-23 09:56:23.693847518 +0000 UTC m=+0.119909735 container health_status 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:56:23 localhost podman[311991]: 2026-02-23 09:56:23.705272902 +0000 UTC m=+0.131335069 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , 
managed_by=edpm_ansible) Feb 23 04:56:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:23.706 265541 INFO neutron.agent.dhcp.agent [None req-96683e3f-c935-4255-96f4-59816f0e3683 - - - - - -] Synchronizing state complete#033[00m Feb 23 04:56:23 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:56:23 localhost systemd[1]: var-lib-containers-storage-overlay-6fd10a276b25a13959ac50d6133201d49913bb2a1ba87afd357e7af4f3a8b1bb-merged.mount: Deactivated successfully. Feb 23 04:56:23 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:23 localhost systemd[1]: run-netns-qdhcp\x2d05d93df9\x2d29e9\x2d48b0\x2d9b9e\x2d8c7a4eaa7448.mount: Deactivated successfully. Feb 23 04:56:23 localhost systemd[1]: libpod-conmon-db7eafbc6ed283ccfebcf7b59820659e7db729ce0b671b853472707c4601cc8a.scope: Deactivated successfully. Feb 23 04:56:24 localhost nova_compute[282206]: 2026-02-23 09:56:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:24 localhost nova_compute[282206]: 2026-02-23 09:56:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:24 localhost nova_compute[282206]: 2026-02-23 09:56:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:56:25 localhost nova_compute[282206]: 2026-02-23 09:56:25.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:25 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:25.537 2 INFO neutron.agent.securitygroups_rpc [None req-7a66e0e4-687a-49c0-b7a3-b39df7d3f4b0 70f605e811404c4bb9fe49c02ce24bf3 a2f9492758b148768734fafb039e58db - - default default] Security group member updated ['a4d30edc-cb55-4200-8dd2-93ea986a3cd5']#033[00m Feb 23 04:56:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:56:25 localhost podman[312029]: 2026-02-23 09:56:25.912339519 +0000 UTC m=+0.081073786 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, vcs-type=git, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.expose-services=, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container) Feb 23 04:56:25 localhost podman[312029]: 2026-02-23 09:56:25.92822739 +0000 UTC m=+0.096961677 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git) Feb 23 04:56:25 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:56:26 localhost nova_compute[282206]: 2026-02-23 09:56:26.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:26 localhost nova_compute[282206]: 2026-02-23 09:56:26.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:26 localhost nova_compute[282206]: 2026-02-23 09:56:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:26 localhost ovn_controller[157695]: 2026-02-23T09:56:26Z|00168|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:26 localhost sshd[312049]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:56:26 localhost nova_compute[282206]: 2026-02-23 09:56:26.276 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:27 localhost 
nova_compute[282206]: 2026-02-23 09:56:27.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.080 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.080 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:56:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:56:27 
localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/194672215' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.523 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.601 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.602 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.829 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.830 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11384MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.831 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.831 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.884 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.884 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.884 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:27 localhost nova_compute[282206]: 2026-02-23 09:56:27.920 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:56:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:56:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2335027054' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:56:28 localhost nova_compute[282206]: 2026-02-23 09:56:28.351 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:56:28 localhost nova_compute[282206]: 2026-02-23 09:56:28.358 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:56:28 localhost nova_compute[282206]: 2026-02-23 09:56:28.378 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:56:28 localhost nova_compute[282206]: 2026-02-23 09:56:28.380 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:56:28 localhost nova_compute[282206]: 2026-02-23 09:56:28.381 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:56:28 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:28.788 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:09:8b 10.100.0.19 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=85b3f5b6-1b29-412a-9c40-de284e163599, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b34126a6-b855-4d0f-af32-431b42ec89f3) old=Port_Binding(mac=['fa:16:3e:23:09:8b 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6f90c3e-e9fc-4b4d-8000-6715492c6006', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:28 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:28.790 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b34126a6-b855-4d0f-af32-431b42ec89f3 in datapath f6f90c3e-e9fc-4b4d-8000-6715492c6006 updated#033[00m Feb 23 04:56:28 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:28.793 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6f90c3e-e9fc-4b4d-8000-6715492c6006, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:56:28 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:28.794 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3782a614-483a-4527-8690-3d7c631bd7eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:29 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:29.077 265541 INFO neutron.agent.linux.ip_lib [None req-9053744e-16fe-4544-acb7-09db1134d43f - - - - - -] Device tap82d3eaca-cb cannot be used as it has no MAC address#033[00m Feb 23 04:56:29 localhost nova_compute[282206]: 2026-02-23 09:56:29.134 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:29 localhost kernel: device tap82d3eaca-cb entered promiscuous mode Feb 23 04:56:29 localhost NetworkManager[5974]: [1771840589.1421] manager: (tap82d3eaca-cb): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Feb 23 04:56:29 localhost nova_compute[282206]: 2026-02-23 09:56:29.143 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:29 localhost systemd-udevd[312105]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:56:29 localhost nova_compute[282206]: 2026-02-23 09:56:29.148 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:29 localhost ovn_controller[157695]: 2026-02-23T09:56:29Z|00169|binding|INFO|Claiming lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 for this chassis. Feb 23 04:56:29 localhost ovn_controller[157695]: 2026-02-23T09:56:29Z|00170|binding|INFO|82d3eaca-cb68-45a9-bf57-61ec3eca3d02: Claiming unknown Feb 23 04:56:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:29.163 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=82d3eaca-cb68-45a9-bf57-61ec3eca3d02) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:29.165 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 bound to our chassis#033[00m Feb 23 04:56:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:29.168 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2ad34212-24f1-4cd3-b44f-f6713c550041 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:29.169 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[35d603bf-0982-44fc-8443-55d39408bfc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost ovn_controller[157695]: 2026-02-23T09:56:29Z|00171|binding|INFO|Setting lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 ovn-installed in OVS Feb 23 04:56:29 localhost ovn_controller[157695]: 2026-02-23T09:56:29Z|00172|binding|INFO|Setting lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 up in Southbound Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost nova_compute[282206]: 2026-02-23 09:56:29.176 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 
localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost journal[231253]: ethtool ioctl error on tap82d3eaca-cb: No such device Feb 23 04:56:29 localhost nova_compute[282206]: 2026-02-23 09:56:29.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:29 localhost nova_compute[282206]: 2026-02-23 09:56:29.251 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:30 localhost podman[312176]: Feb 23 04:56:30 localhost podman[312176]: 2026-02-23 09:56:30.069211777 +0000 UTC m=+0.092214360 container create b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:56:30 localhost systemd[1]: Started libpod-conmon-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182.scope. Feb 23 04:56:30 localhost podman[312176]: 2026-02-23 09:56:30.025620771 +0000 UTC m=+0.048623394 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:56:30 localhost systemd[1]: Started libcrun container. 
Feb 23 04:56:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/acad2e7ba45657c1d140e9cfc4440102975b73a8b85654d17bed266ce7bbe0a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:56:30 localhost podman[312176]: 2026-02-23 09:56:30.159991052 +0000 UTC m=+0.182993635 container init b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0) Feb 23 04:56:30 localhost podman[312176]: 2026-02-23 09:56:30.169845606 +0000 UTC m=+0.192848189 container start b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:30 localhost dnsmasq[312194]: started, version 2.85 cachesize 150 Feb 23 04:56:30 localhost dnsmasq[312194]: DNS service limited to local subnets Feb 23 04:56:30 localhost dnsmasq[312194]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:56:30 localhost dnsmasq[312194]: warning: no upstream servers 
configured Feb 23 04:56:30 localhost dnsmasq-dhcp[312194]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:56:30 localhost dnsmasq[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses Feb 23 04:56:30 localhost dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host Feb 23 04:56:30 localhost dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts Feb 23 04:56:30 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:30.311 265541 INFO neutron.agent.dhcp.agent [None req-1ace6790-003c-4f84-8ec5-da0bdccc3811 - - - - - -] DHCP configuration for ports {'90d73603-9275-4864-aac8-38669974f0c1'} is completed#033[00m Feb 23 04:56:30 localhost nova_compute[282206]: 2026-02-23 09:56:30.382 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:56:30 localhost dnsmasq[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses Feb 23 04:56:30 localhost dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host Feb 23 04:56:30 localhost dnsmasq-dhcp[312194]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts Feb 23 04:56:30 localhost podman[312210]: 2026-02-23 09:56:30.493359562 +0000 UTC m=+0.058045025 container kill b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216) Feb 23 04:56:30 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:30.763 265541 INFO neutron.agent.dhcp.agent [None req-d36b9e67-9dda-46bc-8595-5ffd7018ed96 - - - - - -] DHCP configuration for ports {'82d3eaca-cb68-45a9-bf57-61ec3eca3d02', '90d73603-9275-4864-aac8-38669974f0c1'} is completed#033[00m Feb 23 04:56:30 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:30.909 2 INFO neutron.agent.securitygroups_rpc [None req-51f4eed0-aded-49bd-977f-d8680aeec69e 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:31 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:31.239 2 INFO neutron.agent.securitygroups_rpc [None req-2890ed0d-a00f-4342-b403-59e82c71dfe3 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:31 localhost ovn_controller[157695]: 2026-02-23T09:56:31Z|00173|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:31 localhost nova_compute[282206]: 2026-02-23 09:56:31.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:32 localhost ovn_controller[157695]: 2026-02-23T09:56:32Z|00174|binding|INFO|Removing iface tap82d3eaca-cb ovn-installed in OVS Feb 23 04:56:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:32.436 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 
567e9ecf-e387-4a72-9734-88b8c159914d with type ""#033[00m Feb 23 04:56:32 localhost ovn_controller[157695]: 2026-02-23T09:56:32Z|00175|binding|INFO|Removing lport 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 ovn-installed in OVS Feb 23 04:56:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:32.437 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=82d3eaca-cb68-45a9-bf57-61ec3eca3d02) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:32.439 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 82d3eaca-cb68-45a9-bf57-61ec3eca3d02 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 unbound from our chassis#033[00m Feb 23 04:56:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:32.443 163572 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ad34212-24f1-4cd3-b44f-f6713c550041, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:56:32 localhost nova_compute[282206]: 2026-02-23 09:56:32.444 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:32.444 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[75c660df-ce9c-43c8-a337-95f303c52707]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:32 localhost dnsmasq[312194]: exiting on receipt of SIGTERM Feb 23 04:56:32 localhost podman[312246]: 2026-02-23 09:56:32.554654265 +0000 UTC m=+0.057868298 container kill b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:56:32 localhost systemd[1]: libpod-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182.scope: Deactivated successfully. 
Feb 23 04:56:32 localhost podman[312260]: 2026-02-23 09:56:32.623849284 +0000 UTC m=+0.051796531 container died b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:56:32 localhost systemd[1]: tmp-crun.YTYRfG.mount: Deactivated successfully. Feb 23 04:56:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:32 localhost podman[312260]: 2026-02-23 09:56:32.665734598 +0000 UTC m=+0.093681815 container remove b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216) Feb 23 04:56:32 localhost systemd[1]: libpod-conmon-b9051af7e34c51a3cfc500658b9baedafff338976b75ea91a3e471246f719182.scope: Deactivated successfully. 
Feb 23 04:56:32 localhost kernel: device tap82d3eaca-cb left promiscuous mode Feb 23 04:56:32 localhost nova_compute[282206]: 2026-02-23 09:56:32.680 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:32 localhost nova_compute[282206]: 2026-02-23 09:56:32.689 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.705 265541 INFO neutron.agent.dhcp.agent [None req-96683e3f-c935-4255-96f4-59816f0e3683 - - - - - -] Synchronizing state#033[00m Feb 23 04:56:32 localhost nova_compute[282206]: 2026-02-23 09:56:32.915 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.962 265541 INFO neutron.agent.dhcp.agent [None req-274e03b2-f74d-44f1-b54c-1efda683bbca - - - - - -] All active networks have been fetched through RPC.#033[00m Feb 23 04:56:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.963 265541 INFO neutron.agent.dhcp.agent [-] Starting network 2ad34212-24f1-4cd3-b44f-f6713c550041 dhcp configuration#033[00m Feb 23 04:56:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.966 265541 INFO neutron.agent.dhcp.agent [-] Starting network f240f7f9-ece5-4389-81ed-fec84e1bb5f7 dhcp configuration#033[00m Feb 23 04:56:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:32.967 265541 INFO neutron.agent.dhcp.agent [-] Finished network f240f7f9-ece5-4389-81ed-fec84e1bb5f7 dhcp configuration#033[00m Feb 23 04:56:32 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:32.980 2 INFO neutron.agent.securitygroups_rpc [None req-5d3e7d76-e87c-4157-8484-7f31d3f7ba7b 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default 
default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:32 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:32.988 2 INFO neutron.agent.securitygroups_rpc [None req-f8235bf5-0ea1-4901-953e-20a5808b9d67 70f605e811404c4bb9fe49c02ce24bf3 a2f9492758b148768734fafb039e58db - - default default] Security group member updated ['a4d30edc-cb55-4200-8dd2-93ea986a3cd5']#033[00m Feb 23 04:56:33 localhost systemd[1]: var-lib-containers-storage-overlay-acad2e7ba45657c1d140e9cfc4440102975b73a8b85654d17bed266ce7bbe0a8-merged.mount: Deactivated successfully. Feb 23 04:56:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:33.644 265541 INFO neutron.agent.linux.ip_lib [None req-14238360-846a-4748-b5df-245855e99c07 - - - - - -] Device tapf95ad720-f8 cannot be used as it has no MAC address#033[00m Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.706 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:33 localhost kernel: device tapf95ad720-f8 entered promiscuous mode Feb 23 04:56:33 localhost NetworkManager[5974]: [1771840593.7135] manager: (tapf95ad720-f8): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Feb 23 04:56:33 localhost ovn_controller[157695]: 2026-02-23T09:56:33Z|00176|binding|INFO|Claiming lport f95ad720-f864-4c58-8f6b-16288980b877 for this chassis. Feb 23 04:56:33 localhost ovn_controller[157695]: 2026-02-23T09:56:33Z|00177|binding|INFO|f95ad720-f864-4c58-8f6b-16288980b877: Claiming unknown Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.714 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:33 localhost systemd-udevd[312295]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.722 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:33 localhost ovn_controller[157695]: 2026-02-23T09:56:33Z|00178|binding|INFO|Setting lport f95ad720-f864-4c58-8f6b-16288980b877 ovn-installed in OVS Feb 23 04:56:33 localhost ovn_controller[157695]: 2026-02-23T09:56:33Z|00179|binding|INFO|Setting lport f95ad720-f864-4c58-8f6b-16288980b877 up in Southbound Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.726 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:33.723 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[], tunnel_key=2, gateway_chassis=[], 
requested_chassis=[], logical_port=f95ad720-f864-4c58-8f6b-16288980b877) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:33.726 163572 INFO neutron.agent.ovn.metadata.agent [-] Port f95ad720-f864-4c58-8f6b-16288980b877 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 bound to our chassis#033[00m Feb 23 04:56:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:33.728 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2ad34212-24f1-4cd3-b44f-f6713c550041 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:33.729 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[dc5ff5af-413c-4b35-b5bd-a04bd4247ae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.754 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost 
journal[231253]: ethtool ioctl error on tapf95ad720-f8: No such device Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.797 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:33 localhost nova_compute[282206]: 2026-02-23 09:56:33.821 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:34 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:34.327 2 INFO neutron.agent.securitygroups_rpc [None req-3413d0d6-8a0f-497e-801e-d0383982e452 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:34 localhost podman[312366]: Feb 23 04:56:34 localhost podman[312366]: 2026-02-23 09:56:34.592209297 +0000 UTC m=+0.087680129 container create f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:56:34 localhost systemd[1]: Started libpod-conmon-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783.scope. Feb 23 04:56:34 localhost systemd[1]: tmp-crun.PFO2Gh.mount: Deactivated successfully. 
Feb 23 04:56:34 localhost podman[312366]: 2026-02-23 09:56:34.548647011 +0000 UTC m=+0.044117863 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:56:34 localhost systemd[1]: Started libcrun container. Feb 23 04:56:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1f03d05a0e5a4232a0d0df03e37aa60cfe00098407af5ed8503081a6f008fa81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:56:34 localhost podman[312366]: 2026-02-23 09:56:34.666260935 +0000 UTC m=+0.161731777 container init f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:56:34 localhost podman[312366]: 2026-02-23 09:56:34.680223956 +0000 UTC m=+0.175694798 container start f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:56:34 localhost dnsmasq[312385]: started, version 2.85 cachesize 150 Feb 23 04:56:34 localhost dnsmasq[312385]: DNS service limited to local subnets Feb 23 04:56:34 localhost 
dnsmasq[312385]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:56:34 localhost dnsmasq[312385]: warning: no upstream servers configured Feb 23 04:56:34 localhost dnsmasq-dhcp[312385]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:56:34 localhost dnsmasq[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses Feb 23 04:56:34 localhost dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host Feb 23 04:56:34 localhost dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts Feb 23 04:56:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.739 265541 INFO neutron.agent.dhcp.agent [None req-14238360-846a-4748-b5df-245855e99c07 - - - - - -] Finished network 2ad34212-24f1-4cd3-b44f-f6713c550041 dhcp configuration#033[00m Feb 23 04:56:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.740 265541 INFO neutron.agent.dhcp.agent [None req-274e03b2-f74d-44f1-b54c-1efda683bbca - - - - - -] Synchronizing state complete#033[00m Feb 23 04:56:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.823 265541 INFO neutron.agent.dhcp.agent [None req-38eb3c3a-4c0d-4b0e-a13e-9cd621e7421d - - - - - -] DHCP configuration for ports {'90d73603-9275-4864-aac8-38669974f0c1'} is completed#033[00m Feb 23 04:56:34 localhost dnsmasq[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/addn_hosts - 0 addresses Feb 23 04:56:34 localhost dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/host Feb 23 04:56:34 localhost dnsmasq-dhcp[312385]: read /var/lib/neutron/dhcp/2ad34212-24f1-4cd3-b44f-f6713c550041/opts Feb 23 04:56:34 localhost podman[312402]: 2026-02-23 09:56:34.909621033 +0000 UTC m=+0.060547521 container kill 
f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:56:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:34.926 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.243 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 7a034147-002e-4bd4-bc87-146a85b374e8 with type ""#033[00m Feb 23 04:56:35 localhost ovn_controller[157695]: 2026-02-23T09:56:35Z|00180|binding|INFO|Removing iface tapf95ad720-f8 ovn-installed in OVS Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.244 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ad34212-24f1-4cd3-b44f-f6713c550041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 
'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e035a433-4457-40d5-9858-3562dbadafb2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f95ad720-f864-4c58-8f6b-16288980b877) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.247 163572 INFO neutron.agent.ovn.metadata.agent [-] Port f95ad720-f864-4c58-8f6b-16288980b877 in datapath 2ad34212-24f1-4cd3-b44f-f6713c550041 unbound from our chassis#033[00m Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.250 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2ad34212-24f1-4cd3-b44f-f6713c550041 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:35 localhost ovn_controller[157695]: 2026-02-23T09:56:35Z|00181|binding|INFO|Removing lport f95ad720-f864-4c58-8f6b-16288980b877 ovn-installed in OVS Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.251 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[196ea26e-b02e-4694-971c-4194d867ae50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:35 localhost nova_compute[282206]: 2026-02-23 09:56:35.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:35 localhost dnsmasq[312385]: exiting on receipt of SIGTERM Feb 23 04:56:35 localhost podman[312441]: 2026-02-23 09:56:35.289326775 +0000 UTC 
m=+0.061161951 container kill f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:56:35 localhost systemd[1]: libpod-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783.scope: Deactivated successfully. Feb 23 04:56:35 localhost podman[312456]: 2026-02-23 09:56:35.367392567 +0000 UTC m=+0.061534882 container died f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2) Feb 23 04:56:35 localhost podman[312456]: 2026-02-23 09:56:35.397019942 +0000 UTC m=+0.091162217 container cleanup f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0) Feb 23 04:56:35 localhost systemd[1]: libpod-conmon-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783.scope: Deactivated successfully. Feb 23 04:56:35 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:35.442 2 INFO neutron.agent.securitygroups_rpc [None req-aa725386-7116-408a-a619-1f9f73c010a8 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:35 localhost podman[312457]: 2026-02-23 09:56:35.44551574 +0000 UTC m=+0.133950139 container remove f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2ad34212-24f1-4cd3-b44f-f6713c550041, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:56:35 localhost kernel: device tapf95ad720-f8 left promiscuous mode Feb 23 04:56:35 localhost nova_compute[282206]: 2026-02-23 09:56:35.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:35 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:35.462 2 INFO neutron.agent.securitygroups_rpc [None req-ff1e8db8-3064-40a0-abdd-bddb9c09e449 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:35 localhost nova_compute[282206]: 2026-02-23 09:56:35.478 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:35 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:35.510 265541 INFO neutron.agent.dhcp.agent [None req-33006b32-0a26-43fd-a576-94e8c8a0331e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:35 localhost ovn_controller[157695]: 2026-02-23T09:56:35Z|00182|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:35 localhost systemd[1]: tmp-crun.BbSzKM.mount: Deactivated successfully. Feb 23 04:56:35 localhost systemd[1]: var-lib-containers-storage-overlay-1f03d05a0e5a4232a0d0df03e37aa60cfe00098407af5ed8503081a6f008fa81-merged.mount: Deactivated successfully. Feb 23 04:56:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f08fa3c657fc5e00bd3d993ad47b84408d9b498838a5f2f3281a53587a59f783-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:35 localhost systemd[1]: run-netns-qdhcp\x2d2ad34212\x2d24f1\x2d4cd3\x2db44f\x2df6713c550041.mount: Deactivated successfully. 
Feb 23 04:56:35 localhost nova_compute[282206]: 2026-02-23 09:56:35.600 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:35 localhost podman[312501]: 2026-02-23 09:56:35.747140359 +0000 UTC m=+0.057305211 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:56:35 localhost dnsmasq[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/addn_hosts - 0 addresses Feb 23 04:56:35 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/host Feb 23 04:56:35 localhost dnsmasq-dhcp[310490]: read /var/lib/neutron/dhcp/5b164f5a-6aae-4898-a6ea-a1c77a8cf652/opts Feb 23 04:56:35 localhost ovn_controller[157695]: 2026-02-23T09:56:35Z|00183|binding|INFO|Releasing lport fea38170-0626-427b-8a36-b82b8e008ab6 from this chassis (sb_readonly=0) Feb 23 04:56:35 localhost ovn_controller[157695]: 2026-02-23T09:56:35Z|00184|binding|INFO|Setting lport fea38170-0626-427b-8a36-b82b8e008ab6 down in Southbound Feb 23 04:56:35 localhost nova_compute[282206]: 2026-02-23 09:56:35.963 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:35 localhost kernel: device tapfea38170-06 left promiscuous mode Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.976 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched 
UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b164f5a-6aae-4898-a6ea-a1c77a8cf652', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5632ff1108264def864ca9b5473cb716', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cca3f636-88c2-4a23-a28f-aa045d27b076, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fea38170-0626-427b-8a36-b82b8e008ab6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.978 163572 INFO neutron.agent.ovn.metadata.agent [-] Port fea38170-0626-427b-8a36-b82b8e008ab6 in datapath 5b164f5a-6aae-4898-a6ea-a1c77a8cf652 unbound from our chassis#033[00m Feb 23 04:56:35 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:35.982 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:56:35 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:56:35.983 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e8857760-669a-4443-a1a1-6ff0f740ca92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:35 localhost nova_compute[282206]: 2026-02-23 09:56:35.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:36 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:36.513 2 INFO neutron.agent.securitygroups_rpc [None req-1fda4918-c68f-4c9d-a41e-c71c19c42e64 f6ed429d4dee4c5abef411f5952801ef 7760b87546484c7693fd48206e06d3f8 - - default default] Security group member updated ['1a09a3fa-6a99-44c4-8684-508fe117a320']#033[00m Feb 23 04:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:56:36 localhost systemd[1]: tmp-crun.k7X80M.mount: Deactivated successfully. 
Feb 23 04:56:36 localhost podman[312525]: 2026-02-23 09:56:36.923159993 +0000 UTC m=+0.095212823 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:56:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:36 localhost podman[312525]: 2026-02-23 09:56:36.962529419 +0000 UTC m=+0.134582209 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:56:36 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:56:37 localhost podman[312524]: 2026-02-23 09:56:37.022400858 +0000 UTC m=+0.197000727 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller) Feb 23 04:56:37 localhost podman[312524]: 2026-02-23 09:56:37.121260863 +0000 UTC m=+0.295860762 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller) Feb 23 04:56:37 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:56:37 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:37.667 2 INFO neutron.agent.securitygroups_rpc [None req-0f5e541e-74ab-492f-8902-38d7300ec53b f6ed429d4dee4c5abef411f5952801ef 7760b87546484c7693fd48206e06d3f8 - - default default] Security group member updated ['1a09a3fa-6a99-44c4-8684-508fe117a320']#033[00m Feb 23 04:56:37 localhost systemd[1]: tmp-crun.IiNlwB.mount: Deactivated successfully. 
Feb 23 04:56:37 localhost nova_compute[282206]: 2026-02-23 09:56:37.917 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:39 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:39.084 2 INFO neutron.agent.securitygroups_rpc [None req-0564baea-5d79-415c-93b0-64b1a1c7383f d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:39 localhost ovn_controller[157695]: 2026-02-23T09:56:39Z|00185|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:39 localhost nova_compute[282206]: 2026-02-23 09:56:39.157 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:39 localhost podman[242954]: time="2026-02-23T09:56:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:56:39 localhost podman[242954]: @ - - [23/Feb/2026:09:56:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160729 "" "Go-http-client/1.1" Feb 23 04:56:39 localhost podman[242954]: @ - - [23/Feb/2026:09:56:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19763 "" "Go-http-client/1.1" Feb 23 04:56:39 localhost dnsmasq[310490]: exiting on receipt of SIGTERM Feb 23 04:56:39 localhost podman[312589]: 2026-02-23 09:56:39.689982756 +0000 UTC m=+0.060289904 container kill 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:56:39 localhost systemd[1]: libpod-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905.scope: Deactivated successfully. Feb 23 04:56:39 localhost podman[312601]: 2026-02-23 09:56:39.756737387 +0000 UTC m=+0.055635239 container died 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:56:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:56:39 localhost podman[312601]: 2026-02-23 09:56:39.79048329 +0000 UTC m=+0.089381112 container cleanup 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:56:39 localhost systemd[1]: libpod-conmon-877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905.scope: Deactivated successfully. Feb 23 04:56:39 localhost podman[312608]: 2026-02-23 09:56:39.84065743 +0000 UTC m=+0.126824789 container remove 877649022c10c10c453bdf68b55a9c7d3447d25e11bf446b40e71f7097310905 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b164f5a-6aae-4898-a6ea-a1c77a8cf652, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:39 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:39.873 265541 INFO neutron.agent.dhcp.agent [None req-2c69bfeb-4dd0-4d65-83c8-716dcc5d0958 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:39 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:39.874 265541 INFO neutron.agent.dhcp.agent [None req-2c69bfeb-4dd0-4d65-83c8-716dcc5d0958 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:40 localhost 
systemd[1]: var-lib-containers-storage-overlay-930a766cf5e4b8a84c31286e57382a4788939504b12f986029e04d96b0f3a126-merged.mount: Deactivated successfully. Feb 23 04:56:40 localhost systemd[1]: run-netns-qdhcp\x2d5b164f5a\x2d6aae\x2d4898\x2da6ea\x2da1c77a8cf652.mount: Deactivated successfully. Feb 23 04:56:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:56:41 localhost podman[312629]: 2026-02-23 09:56:41.912552333 +0000 UTC m=+0.084813361 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:41 localhost podman[312629]: 2026-02-23 09:56:41.927400921 +0000 UTC m=+0.099661919 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:56:41 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:56:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:41 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:41.987 2 INFO neutron.agent.securitygroups_rpc [None req-00c7bab6-3ec4-412e-b4e0-8dc31e0d8362 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:42 localhost nova_compute[282206]: 2026-02-23 09:56:42.920 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost openstack_network_exporter[245358]: ERROR 09:56:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:56:43 localhost openstack_network_exporter[245358]: Feb 23 04:56:43 localhost openstack_network_exporter[245358]: ERROR 09:56:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:56:43 localhost openstack_network_exporter[245358]: Feb 23 04:56:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:43.447 265541 INFO 
neutron.agent.linux.ip_lib [None req-c0a85ad2-4ddd-4097-ada4-a5dc6b3ee6e8 - - - - - -] Device tap556381c3-ac cannot be used as it has no MAC address#033[00m Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost kernel: device tap556381c3-ac entered promiscuous mode Feb 23 04:56:43 localhost NetworkManager[5974]: [1771840603.4749] manager: (tap556381c3-ac): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.474 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost ovn_controller[157695]: 2026-02-23T09:56:43Z|00186|binding|INFO|Claiming lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 for this chassis. Feb 23 04:56:43 localhost ovn_controller[157695]: 2026-02-23T09:56:43Z|00187|binding|INFO|556381c3-ace4-4128-82b0-c33c8ad0e1c9: Claiming unknown Feb 23 04:56:43 localhost systemd-udevd[312658]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:56:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:43.494 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0aaf8a49-34b1-4108-a38e-a49be6b7ace1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=556381c3-ace4-4128-82b0-c33c8ad0e1c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:43.495 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 556381c3-ace4-4128-82b0-c33c8ad0e1c9 in datapath 6d7119b8-7a22-4328-9884-4df90f2c3ebd bound to our chassis#033[00m Feb 23 04:56:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:43.497 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d7119b8-7a22-4328-9884-4df90f2c3ebd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:43 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:43.498 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f10fe368-7b53-4397-97ee-c9d8fb5a36f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.505 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost ovn_controller[157695]: 2026-02-23T09:56:43Z|00188|binding|INFO|Setting lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 ovn-installed in OVS Feb 23 04:56:43 localhost ovn_controller[157695]: 2026-02-23T09:56:43Z|00189|binding|INFO|Setting lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 up in Southbound Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.510 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.513 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost 
journal[231253]: ethtool ioctl error on tap556381c3-ac: No such device Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.551 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:43 localhost nova_compute[282206]: 2026-02-23 09:56:43.579 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:44 localhost podman[312729]: Feb 23 04:56:44 localhost podman[312729]: 2026-02-23 09:56:44.395000679 +0000 UTC m=+0.086549165 container create 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:56:44 localhost systemd[1]: Started libpod-conmon-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd.scope. Feb 23 04:56:44 localhost podman[312729]: 2026-02-23 09:56:44.354194409 +0000 UTC m=+0.045742925 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:56:44 localhost systemd[1]: Started libcrun container. 
Feb 23 04:56:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b266334acb0e55f0426e207375e16e3c1c5eaa3a9f7c324de8884ec556d7b30d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:56:44 localhost podman[312742]: 2026-02-23 09:56:44.518122113 +0000 UTC m=+0.084030098 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Feb 23 04:56:44 localhost podman[312742]: 2026-02-23 09:56:44.523518849 +0000 UTC m=+0.089426804 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent) Feb 23 04:56:44 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:56:44 localhost podman[312729]: 2026-02-23 09:56:44.550828483 +0000 UTC m=+0.242376969 container init 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:44 localhost podman[312729]: 2026-02-23 09:56:44.560502312 +0000 UTC m=+0.252050808 container start 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:44 localhost dnsmasq[312766]: started, version 2.85 cachesize 150 Feb 23 04:56:44 localhost dnsmasq[312766]: DNS service limited to local subnets Feb 23 04:56:44 localhost dnsmasq[312766]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect 
inotify dumpfile Feb 23 04:56:44 localhost dnsmasq[312766]: warning: no upstream servers configured Feb 23 04:56:44 localhost dnsmasq-dhcp[312766]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:56:44 localhost dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 0 addresses Feb 23 04:56:44 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host Feb 23 04:56:44 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts Feb 23 04:56:45 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:45.031 265541 INFO neutron.agent.dhcp.agent [None req-6ee0dc6d-6a24-4d3b-b2fd-e00b24f3197f - - - - - -] DHCP configuration for ports {'771090fe-1a3e-436c-a027-4e2b743c323c'} is completed#033[00m Feb 23 04:56:45 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:45.650 2 INFO neutron.agent.securitygroups_rpc [None req-f79d3420-cd8f-4700-a586-57c3375d8a5c 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:45 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:45.694 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:45Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=36514155-4ae9-4996-bb88-a40fb9ad6bcf, ip_allocation=immediate, mac_address=fa:16:3e:92:f1:55, name=tempest-PortsTestJSON-1106137784, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:40Z, description=, dns_domain=, id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, ipv4_address_scope=None, ipv6_address_scope=None, 
l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1304508516, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7415, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1405, status=ACTIVE, subnets=['ebe8b110-7353-4fff-bc4d-dbab752dab8b'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:42Z, vlan_transparent=None, network_id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['709ad995-bfde-4096-a0b4-2ba30248a611'], standard_attr_id=1440, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:45Z on network 6d7119b8-7a22-4328-9884-4df90f2c3ebd#033[00m Feb 23 04:56:45 localhost dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 1 addresses Feb 23 04:56:45 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host Feb 23 04:56:45 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts Feb 23 04:56:45 localhost podman[312784]: 2026-02-23 09:56:45.90595029 +0000 UTC m=+0.065001959 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2) Feb 23 04:56:46 
localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:46.121 265541 INFO neutron.agent.dhcp.agent [None req-1a048b78-c850-4495-8445-cc78629209cc - - - - - -] DHCP configuration for ports {'36514155-4ae9-4996-bb88-a40fb9ad6bcf'} is completed#033[00m Feb 23 04:56:46 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:46.317 2 INFO neutron.agent.securitygroups_rpc [None req-23858e93-d857-4321-b908-702c515a7b92 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:46 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:46.605 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7a454d8b-c57f-41ac-a163-5c4d28169a90, ip_allocation=immediate, mac_address=fa:16:3e:18:a1:a7, name=tempest-PortsTestJSON-341249040, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:40Z, description=, dns_domain=, id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1304508516, port_security_enabled=True, project_id=8532226521ac43ca82723a0b71168e03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7415, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1405, status=ACTIVE, subnets=['ebe8b110-7353-4fff-bc4d-dbab752dab8b'], tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:42Z, vlan_transparent=None, network_id=6d7119b8-7a22-4328-9884-4df90f2c3ebd, port_security_enabled=True, 
project_id=8532226521ac43ca82723a0b71168e03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['709ad995-bfde-4096-a0b4-2ba30248a611'], standard_attr_id=1442, status=DOWN, tags=[], tenant_id=8532226521ac43ca82723a0b71168e03, updated_at=2026-02-23T09:56:46Z on network 6d7119b8-7a22-4328-9884-4df90f2c3ebd#033[00m Feb 23 04:56:46 localhost dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 2 addresses Feb 23 04:56:46 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host Feb 23 04:56:46 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts Feb 23 04:56:46 localhost podman[312832]: 2026-02-23 09:56:46.849946295 +0000 UTC m=+0.058179947 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:56:46 localhost dnsmasq[310749]: exiting on receipt of SIGTERM Feb 23 04:56:46 localhost systemd[1]: libpod-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1.scope: Deactivated successfully. 
Feb 23 04:56:46 localhost podman[312849]: 2026-02-23 09:56:46.908823554 +0000 UTC m=+0.069234799 container kill 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:56:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:46 localhost podman[312869]: 2026-02-23 09:56:46.984998478 +0000 UTC m=+0.064466672 container died 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 23 04:56:47 localhost systemd[1]: tmp-crun.BtYp9v.mount: Deactivated successfully. 
Feb 23 04:56:47 localhost podman[312869]: 2026-02-23 09:56:47.028337287 +0000 UTC m=+0.107805441 container cleanup 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:47 localhost systemd[1]: libpod-conmon-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1.scope: Deactivated successfully. Feb 23 04:56:47 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:47.101 265541 INFO neutron.agent.dhcp.agent [None req-3fff2107-a8bf-49c6-a1da-5c55c8e1ef8b - - - - - -] DHCP configuration for ports {'7a454d8b-c57f-41ac-a163-5c4d28169a90'} is completed#033[00m Feb 23 04:56:47 localhost podman[312871]: 2026-02-23 09:56:47.11616207 +0000 UTC m=+0.183437118 container remove 2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b207e42e-4d3c-43ce-b855-2d1a36797be6, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:56:47 localhost nova_compute[282206]: 2026-02-23 09:56:47.164 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:47 localhost kernel: device 
tap5e31c1f9-f2 left promiscuous mode Feb 23 04:56:47 localhost ovn_controller[157695]: 2026-02-23T09:56:47Z|00190|binding|INFO|Releasing lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a from this chassis (sb_readonly=0) Feb 23 04:56:47 localhost ovn_controller[157695]: 2026-02-23T09:56:47Z|00191|binding|INFO|Setting lport 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a down in Southbound Feb 23 04:56:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:47.173 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b207e42e-4d3c-43ce-b855-2d1a36797be6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5c50f2ae0a2f4cdd8225f6794547909b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0b7f1d9-1471-4000-b583-343082500ed7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:47.175 163572 INFO neutron.agent.ovn.metadata.agent [-] 
Port 5e31c1f9-f29e-43a3-8e7c-2e3adb446c2a in datapath b207e42e-4d3c-43ce-b855-2d1a36797be6 unbound from our chassis#033[00m Feb 23 04:56:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:47.177 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network b207e42e-4d3c-43ce-b855-2d1a36797be6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:47.178 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[fdc75a83-2689-4bd8-8c87-e859bdd4583b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:47 localhost nova_compute[282206]: 2026-02-23 09:56:47.185 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:47 localhost nova_compute[282206]: 2026-02-23 09:56:47.187 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:47 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:47.212 265541 INFO neutron.agent.dhcp.agent [None req-e7b3f8fc-1587-45d7-bebb-80552091a52c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:47 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:47.438 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:47 localhost ovn_controller[157695]: 2026-02-23T09:56:47Z|00192|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:47 localhost nova_compute[282206]: 2026-02-23 09:56:47.631 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:47 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:47.754 2 INFO neutron.agent.securitygroups_rpc [None req-531eb3a8-b1e9-4d08-88ed-f2a5323c2530 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:47 localhost systemd[1]: var-lib-containers-storage-overlay-85d2c0758da427144c001a7e97c7359bf4b87ae2a86e80ab66c7f916f65db929-merged.mount: Deactivated successfully. Feb 23 04:56:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2532b22c884299358d19e096c9cf0cdee2dd0b81d7b698e52ccc67ba00649cf1-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:47 localhost systemd[1]: run-netns-qdhcp\x2db207e42e\x2d4d3c\x2d43ce\x2db855\x2d2d1a36797be6.mount: Deactivated successfully. Feb 23 04:56:47 localhost nova_compute[282206]: 2026-02-23 09:56:47.923 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:48 localhost dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 1 addresses Feb 23 04:56:48 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host Feb 23 04:56:48 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts Feb 23 04:56:48 localhost podman[312917]: 2026-02-23 09:56:48.038400924 +0000 UTC m=+0.067758656 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0) Feb 23 04:56:48 localhost systemd[1]: tmp-crun.epo7Gj.mount: Deactivated successfully. Feb 23 04:56:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:48.557 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:56:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:56:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:56:49 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:49.054 2 INFO neutron.agent.securitygroups_rpc [None req-7c4570b4-c3dc-480d-b91f-3f4320c93168 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:49 localhost dnsmasq[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/addn_hosts - 0 addresses Feb 23 04:56:49 localhost podman[312952]: 2026-02-23 09:56:49.379993272 +0000 UTC m=+0.059415017 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:49 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/host Feb 23 04:56:49 localhost dnsmasq-dhcp[312766]: read /var/lib/neutron/dhcp/6d7119b8-7a22-4328-9884-4df90f2c3ebd/opts Feb 23 04:56:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:49.883 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:49 localhost nova_compute[282206]: 2026-02-23 09:56:49.884 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:49.884 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:56:49 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:49.980 2 INFO neutron.agent.securitygroups_rpc [None req-df38b538-2f68-43d4-a4e5-1e14d933a2a9 
a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']#033[00m Feb 23 04:56:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:50.359 265541 INFO neutron.agent.linux.ip_lib [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Device tap6b8e941b-e3 cannot be used as it has no MAC address#033[00m Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.383 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost kernel: device tap6b8e941b-e3 entered promiscuous mode Feb 23 04:56:50 localhost NetworkManager[5974]: [1771840610.3920] manager: (tap6b8e941b-e3): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00193|binding|INFO|Claiming lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 for this chassis. Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00194|binding|INFO|6b8e941b-e318-43c8-8da1-efc8c08d0ac8: Claiming unknown Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.392 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost systemd-udevd[313011]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.407 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8630a66fd9f41828b0bd2cf93b5956f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c6d3781-7aae-4474-bf2c-0e950a13f37c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6b8e941b-e318-43c8-8da1-efc8c08d0ac8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.408 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 in datapath 2fe91a2a-5b02-4767-89ca-7f8954141d90 bound to our chassis#033[00m Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.410 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fe91a2a-5b02-4767-89ca-7f8954141d90 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:50 localhost dnsmasq[312766]: exiting on receipt of SIGTERM Feb 23 04:56:50 localhost podman[312993]: 2026-02-23 09:56:50.411363666 +0000 UTC m=+0.075830173 container kill 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.411 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[2001e787-29d8-408f-9ccf-9678ed1fb6cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:50 localhost systemd[1]: libpod-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd.scope: Deactivated successfully. 
Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.423 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00195|binding|INFO|Setting lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 ovn-installed in OVS Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00196|binding|INFO|Setting lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 up in Southbound Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.443 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.446 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.492 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost podman[313015]: 2026-02-23 09:56:50.497749676 +0000 UTC m=+0.066608490 container died 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.519 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost podman[313015]: 2026-02-23 09:56:50.52962343 +0000 UTC m=+0.098482184 container cleanup 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true) Feb 23 04:56:50 localhost systemd[1]: libpod-conmon-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd.scope: Deactivated successfully. Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00197|binding|INFO|Removing iface tap556381c3-ac ovn-installed in OVS Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.546 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 5c3fc2f9-1f90-4ed3-b1c0-73af4bb6bf5f with type ""#033[00m Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00198|binding|INFO|Removing lport 556381c3-ace4-4128-82b0-c33c8ad0e1c9 ovn-installed in OVS Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.548 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 
'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d7119b8-7a22-4328-9884-4df90f2c3ebd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0aaf8a49-34b1-4108-a38e-a49be6b7ace1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=556381c3-ace4-4128-82b0-c33c8ad0e1c9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.548 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.550 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 556381c3-ace4-4128-82b0-c33c8ad0e1c9 in datapath 6d7119b8-7a22-4328-9884-4df90f2c3ebd unbound from our chassis#033[00m Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.553 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d7119b8-7a22-4328-9884-4df90f2c3ebd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:56:50 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:50.553 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f1f984-ad83-49ab-8a46-57f63cccd672]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:50 localhost 
nova_compute[282206]: 2026-02-23 09:56:50.554 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost podman[313017]: 2026-02-23 09:56:50.57302823 +0000 UTC m=+0.126369405 container remove 4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d7119b8-7a22-4328-9884-4df90f2c3ebd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:56:50 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:50.589 2 INFO neutron.agent.securitygroups_rpc [None req-8eb799ea-d360-4da4-8f9a-9902d730dcc7 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']#033[00m Feb 23 04:56:50 localhost kernel: device tap556381c3-ac left promiscuous mode Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.625 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.637 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:50.650 265541 INFO neutron.agent.dhcp.agent [None req-30f6c4db-23e3-4788-8b95-e85b5b625129 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 
09:56:50.769 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:50 localhost ovn_controller[157695]: 2026-02-23T09:56:50Z|00199|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:56:50 localhost nova_compute[282206]: 2026-02-23 09:56:50.959 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:51 localhost podman[313096]: Feb 23 04:56:51 localhost podman[313096]: 2026-02-23 09:56:51.368992963 +0000 UTC m=+0.087450054 container create a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:56:51 localhost systemd[1]: Started libpod-conmon-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1.scope. Feb 23 04:56:51 localhost systemd[1]: var-lib-containers-storage-overlay-b266334acb0e55f0426e207375e16e3c1c5eaa3a9f7c324de8884ec556d7b30d-merged.mount: Deactivated successfully. Feb 23 04:56:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c97b74bb0c1ccb48aaab8f53a54e5bed89d083ddf83f7899a80a394fe24eefd-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:51 localhost systemd[1]: run-netns-qdhcp\x2d6d7119b8\x2d7a22\x2d4328\x2d9884\x2d4df90f2c3ebd.mount: Deactivated successfully. 
Feb 23 04:56:51 localhost podman[313096]: 2026-02-23 09:56:51.325815168 +0000 UTC m=+0.044272269 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:56:51 localhost systemd[1]: Started libcrun container. Feb 23 04:56:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb5a3a963be7354e78652f4a92c2c34304d37c406133dfb79b7eec719c4f26a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:56:51 localhost podman[313096]: 2026-02-23 09:56:51.452029258 +0000 UTC m=+0.170486339 container init a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:56:51 localhost podman[313096]: 2026-02-23 09:56:51.462152311 +0000 UTC m=+0.180609402 container start a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 23 04:56:51 localhost dnsmasq[313115]: started, version 2.85 cachesize 150 Feb 23 04:56:51 localhost dnsmasq[313115]: DNS service limited to local subnets Feb 23 04:56:51 localhost 
dnsmasq[313115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:56:51 localhost dnsmasq[313115]: warning: no upstream servers configured Feb 23 04:56:51 localhost dnsmasq-dhcp[313115]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:56:51 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 0 addresses Feb 23 04:56:51 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host Feb 23 04:56:51 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.524 265541 INFO neutron.agent.dhcp.agent [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:49Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4f99192a-5334-414e-8f15-7eeaaea3cb3b, ip_allocation=immediate, mac_address=fa:16:3e:93:f2:6b, name=tempest-ExtraDHCPOptionsIpV6TestJSON-44067334, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:47Z, description=, dns_domain=, id=2fe91a2a-5b02-4767-89ca-7f8954141d90, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-581439413, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1450, 
status=ACTIVE, subnets=['ffdf14ca-f105-4a70-9464-7924b5a5f427'], tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:49Z, vlan_transparent=None, network_id=2fe91a2a-5b02-4767-89ca-7f8954141d90, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ea9a997e-7b09-4599-8d8f-c6dc5472496e'], standard_attr_id=1459, status=DOWN, tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:49Z on network 2fe91a2a-5b02-4767-89ca-7f8954141d90#033[00m Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.597 265541 INFO neutron.agent.dhcp.agent [None req-11ce90fe-c87a-4973-91a4-46f144b76ce1 - - - - - -] DHCP configuration for ports {'21e9c9a4-2bce-498a-92d2-ed020111f0ed'} is completed#033[00m Feb 23 04:56:51 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 1 addresses Feb 23 04:56:51 localhost podman[313135]: 2026-02-23 09:56:51.719602265 +0000 UTC m=+0.062743920 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS) Feb 23 04:56:51 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host Feb 23 04:56:51 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.865 265541 
INFO neutron.agent.dhcp.agent [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:50Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=d84e91b2-8278-4035-a47c-6d8d258897cd, ip_allocation=immediate, mac_address=fa:16:3e:15:53:ce, name=tempest-ExtraDHCPOptionsIpV6TestJSON-7708552, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:56:47Z, description=, dns_domain=, id=2fe91a2a-5b02-4767-89ca-7f8954141d90, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-581439413, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18315, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1450, status=ACTIVE, subnets=['ffdf14ca-f105-4a70-9464-7924b5a5f427'], tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:49Z, vlan_transparent=None, network_id=2fe91a2a-5b02-4767-89ca-7f8954141d90, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['ea9a997e-7b09-4599-8d8f-c6dc5472496e'], standard_attr_id=1460, status=DOWN, tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:50Z on network 2fe91a2a-5b02-4767-89ca-7f8954141d90#033[00m Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.882 265541 INFO neutron.agent.linux.dhcp [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Cannot apply dhcp option 
tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.883 265541 INFO neutron.agent.linux.dhcp [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.884 265541 INFO neutron.agent.linux.dhcp [None req-7d76da0b-577a-4df2-a4d3-a0ad80518032 - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Feb 23 04:56:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:51.886 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:56:51 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:51.917 2 INFO neutron.agent.securitygroups_rpc [None req-b1440841-9f18-4923-88ab-a2d6a50a7349 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']#033[00m Feb 23 04:56:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:51.948 265541 INFO neutron.agent.dhcp.agent [None req-d0f0bbc0-c431-4844-b3ae-9fea310134a7 - - - - - -] DHCP configuration for ports {'4f99192a-5334-414e-8f15-7eeaaea3cb3b'} is completed#033[00m Feb 23 04:56:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:52 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts 
- 2 addresses Feb 23 04:56:52 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host Feb 23 04:56:52 localhost podman[313173]: 2026-02-23 09:56:52.061332713 +0000 UTC m=+0.058572581 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:56:52 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts Feb 23 04:56:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:52.271 265541 INFO neutron.agent.dhcp.agent [None req-ef822ec0-1efe-424b-b1c8-9db8cf35b866 - - - - - -] DHCP configuration for ports {'d84e91b2-8278-4035-a47c-6d8d258897cd'} is completed#033[00m Feb 23 04:56:52 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 1 addresses Feb 23 04:56:52 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host Feb 23 04:56:52 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts Feb 23 04:56:52 localhost podman[313210]: 2026-02-23 09:56:52.423471971 +0000 UTC m=+0.060187711 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS 
Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:56:52 localhost nova_compute[282206]: 2026-02-23 09:56:52.926 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.114 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:56:49Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=4f99192a-5334-414e-8f15-7eeaaea3cb3b, ip_allocation=immediate, mac_address=fa:16:3e:93:f2:6b, name=tempest-new-port-name-2068423017, network_id=2fe91a2a-5b02-4767-89ca-7f8954141d90, port_security_enabled=True, project_id=e8630a66fd9f41828b0bd2cf93b5956f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['ea9a997e-7b09-4599-8d8f-c6dc5472496e'], standard_attr_id=1459, status=DOWN, tags=[], tenant_id=e8630a66fd9f41828b0bd2cf93b5956f, updated_at=2026-02-23T09:56:52Z on network 2fe91a2a-5b02-4767-89ca-7f8954141d90#033[00m Feb 23 04:56:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.130 265541 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions#033[00m Feb 23 04:56:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.131 265541 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions#033[00m Feb 23 04:56:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 
09:56:53.131 265541 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions#033[00m Feb 23 04:56:53 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 1 addresses Feb 23 04:56:53 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host Feb 23 04:56:53 localhost podman[313250]: 2026-02-23 09:56:53.304668696 +0000 UTC m=+0.060684876 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:56:53 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts Feb 23 04:56:53 localhost systemd[1]: tmp-crun.qhvDwc.mount: Deactivated successfully. Feb 23 04:56:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:53.525 265541 INFO neutron.agent.dhcp.agent [None req-234108c3-b184-4e9a-ae6d-ad34f2539da2 - - - - - -] DHCP configuration for ports {'4f99192a-5334-414e-8f15-7eeaaea3cb3b'} is completed#033[00m Feb 23 04:56:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:56:53 localhost systemd[1]: tmp-crun.kj9Ul1.mount: Deactivated successfully. 
Feb 23 04:56:53 localhost podman[313272]: 2026-02-23 09:56:53.909895825 +0000 UTC m=+0.086809513 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:56:53 localhost podman[313272]: 2026-02-23 09:56:53.920224603 +0000 UTC m=+0.097138271 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 04:56:53 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:56:54 localhost nova_compute[282206]: 2026-02-23 09:56:54.217 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:55 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:55.381 2 INFO neutron.agent.securitygroups_rpc [None req-7f69dcdd-1ed9-444c-bbd4-b59ea7457a69 a882fa93577048b68025b6e97dbb9195 e8630a66fd9f41828b0bd2cf93b5956f - - default default] Security group member updated ['ea9a997e-7b09-4599-8d8f-c6dc5472496e']#033[00m Feb 23 04:56:55 localhost podman[313312]: 2026-02-23 09:56:55.602755826 +0000 UTC m=+0.061171561 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:56:55 localhost dnsmasq[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/addn_hosts - 0 addresses Feb 23 04:56:55 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/host Feb 23 04:56:55 localhost dnsmasq-dhcp[313115]: read /var/lib/neutron/dhcp/2fe91a2a-5b02-4767-89ca-7f8954141d90/opts Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.142 12 DEBUG ceilometer.compute.discovery [-] instance data: 
{'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.147 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '182a5f29-179f-43ec-b134-057a9b3fac25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.143353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcbf0cdc-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '6194f0052505030bd3519d93021853ba22b0b090be6798160ace84ce9d8700a2'}]}, 'timestamp': '2026-02-23 09:56:56.148818', '_unique_id': '5a6c332814c944f08eb2473c599ae688'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.150 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1b78942-3f92-4b55-94d1-fab77d176ba4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.151945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc416b4-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '8c142561203ec2b5fec368456ac2f750b06efca8d70e92d20e5c12128dc8a49d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.151945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc42f32-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '299ede49d855797d836b24e6a7a116e24df3e569c267cae52b84c1cc6ffcb855'}]}, 'timestamp': '2026-02-23 09:56:56.182391', '_unique_id': '24182d04d8634dceba034eb1237fe821'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.183 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61af4262-028f-40ca-be6b-688bc93c2fb3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.185214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc60ece-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': 'b607776cc9865ecb4916baab60e1772f4460e66568bf88e174753882e925d928'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.185214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc6217a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': '55aa50982b54a718da044b05df746454cd7625d466d627161612536571186656'}]}, 'timestamp': '2026-02-23 09:56:56.195133', '_unique_id': '251d2fa854d4417dbc8f19b163fb8e67'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.196 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.197 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.197 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.198 12 DEBUG
ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88ba2dd9-4868-4ef7-b497-76e2fec1936b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.197972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc6a424-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': 'ee7fd69b86bc7694fb5f8cec63c9fa3b047447938ef0c4a1793cf0bdd064694f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.197972', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc6b554-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': '38695524ae6a3da7519614866bd3233b01fdd105a9b278904b41ade224ca9d8d'}]}, 'timestamp': '2026-02-23 09:56:56.198930', '_unique_id': '35e18140c057497d94a7cc6191560aba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in
_reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.199 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 04:56:56
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a055a7d7-4096-4049-9960-4ccf233328b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.201568', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcc73600-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '23f0088a78569f6a1e54853b5e39d6003731fa0db7cd8e473ccfa173b223fd21'}]}, 'timestamp': '2026-02-23 09:56:56.202361', '_unique_id': '5837282362cf4d1bad08183a1f1849b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.203 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.205 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'cb10808a-6306-4a91-8713-1a3cb86bf505', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.205586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc7c87c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '6babd20c1e088b50735303df2cbc9219990b19a333379d8fac875316b3b762a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.205586', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc7d4b6-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '96cb1faa2cb55abd599cb6acc85559f04e3b0e935899c22be1e7071f74aacd5c'}]}, 'timestamp': '2026-02-23 09:56:56.206194', '_unique_id': 'f01b0785c0314633acfb3f2bcd20ee58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23
04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.206 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e691f2d7-88db-435b-91e3-72a62913a6ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.207947', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcc824ca-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '8ea2b04db810df805269e6cd833e364a288d475993ef15ecf6e1c07907a43f7f'}]}, 'timestamp': '2026-02-23 09:56:56.208263', '_unique_id': 'e4f34c3c52fd461cbe36b73109f3476b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.209 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a077978-79a7-47b8-a8b7-56b7d019bf94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.209694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcc868d6-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': '64371352c826cba8e0707c00c36567a52eb7f983ba30b98690f26186db2542aa'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.209694', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcc8756a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.374714634, 'message_signature': 'cf40f1b50719a6a354cef93960760b7b0d366e99fdaa5ff07f70d74481f65756'}]}, 'timestamp': '2026-02-23 09:56:56.210311', '_unique_id': '9f9d378e2a2e4f158c4cb9dcb33693b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.210 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.212 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '632651f9-5f27-4ccc-b991-85b912c06212', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.212223', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcc8ce98-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '1f50af63fbbebec0ae311e9a7c1d40f34400c9a5f883ebccd56541f263857d0b'}]}, 'timestamp': '2026-02-23 09:56:56.212614', '_unique_id': '83bb708883f34e889df3d297ee222d77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.213 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.214 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e5269a2-9a9e-4598-85c3-71c96e48621b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:56:56.214110', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fccb94c0-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.419708015, 'message_signature': 'ed77c9f4a29b5952c95b0ccd98badfdd5a09bce4e42a1b4998f1c0693601c9aa'}]}, 'timestamp': '2026-02-23 09:56:56.230785', '_unique_id': '9e7d78e5b21b4be6990d635dd9e6876e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.231 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 13700000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b6f5c72-fe4f-48cc-973c-7ffb24cf45ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13700000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:56:56.232342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fccbdc0a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.419708015, 'message_signature': '77e25bc5a17c50048a2571569f9c094afd818d1c62ea111872812fefe9c2acc7'}]}, 'timestamp': '2026-02-23 09:56:56.232549', '_unique_id': '06c2a898dec0488fa23ff2506741f42a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23
04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:56:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1edb2890-7ebc-4b49-a7aa-42ba1822fd6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.233500', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccc095a-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 
'message_signature': 'cefa7aacf0009fad796f984e5abcbb3fd03a4257f75c7d9ea63e57a704e69800'}]}, 'timestamp': '2026-02-23 09:56:56.233714', '_unique_id': 'd8e8179a5c6b46ae871bc82dd8479bed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '00bd1090-0aa6-430b-a47a-c5fe5714dc59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.234760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fccc3a56-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '3e2c3326a8927a4f233a899e4634e8893e3fd241045555486a85a3fa44cbcc73'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.234760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fccc4276-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'ac5986d178ec894a3bbd764ccf4378c116a036c3424ef7feab5d48a00af6711b'}]}, 'timestamp': '2026-02-23 09:56:56.235162', '_unique_id': 'a1267f309e57426eb5121827c4e3c4a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.235 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '59e39053-beb3-43ad-a797-f41f56bf7f73', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.236129', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccc6fda-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '047a2796f43520a1ba449f7fbe072f0ee9365138d71d429734719079ba1136c0'}]}, 'timestamp': '2026-02-23 09:56:56.236337', '_unique_id': 'a615d1f5095743e98c2c8c30b8ea9651'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '410da362-33ab-4de3-8359-f0269045e4f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.237288', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccc9d16-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': 'd856f632dac75183d63cf6a266c33ef7ab9d0822e985cbd43a04e6587a99707d'}]}, 'timestamp': '2026-02-23 09:56:56.237496', '_unique_id': '968648cdd6154eca813e82812cc224c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.237 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '55223e62-4424-4f26-b20c-2b28d73f19db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.238452', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fcccca98-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '8c292fbc0cc199012fb461b2a8a26f5192eb613694006581b008420442a5f88f'}]}, 'timestamp': '2026-02-23 09:56:56.238661', '_unique_id': '55a5555339de4969a637ce59694fbf0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.239 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f1b036-8927-4d72-bdf9-4b27aa9ea2ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.239601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fcccf75c-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'cd7c6fa981d4acc3b795b3f57fd737cb5ca10fcf81fe557ecec16f5017b31fae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.239601', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fcccff18-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '7444efe10927925292e19018d94d342456484dfe728649588ff0dbfce84feb70'}]}, 'timestamp': '2026-02-23 09:56:56.239993', '_unique_id': '70a29b9903554df2ab004c29aa4b4acc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '112f18bd-2243-4656-97a8-e71d73a58571', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.240955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fccd2c5e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'f62585010c67e6e3cc6b9311d5b2b0b5edfa3ec7efb773a269f55df182ca4b95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.240955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fccd3384-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '7cafdff3fce54c33429659dd61013c32a2e5bd9ffd514ba2a3af8e93a7ea52a4'}]}, 'timestamp': '2026-02-23 09:56:56.241333', '_unique_id': '001868957c5846a9a72f90e678cdd0e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94d3ebad-6450-495e-bc2e-4c09962c479d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.242309', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccd6142-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '0515c2e51e4b619907699068af492a9bfb0f5cc49ae8ba7b35897c32443ee902'}]}, 'timestamp': '2026-02-23 09:56:56.242517', '_unique_id': '6ab435671e4a40e1aa19cbc4e27d0605'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e730e557-7738-4936-a7ac-5c9f76772ab5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:56:56.243447', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'fccd8dac-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.332842681, 'message_signature': '0b2124eb8fc21bffa71622127f5f0f55dedf757b810426415b676441a0cd176d'}]}, 'timestamp': '2026-02-23 09:56:56.243654', '_unique_id': 'b8dcfb95008f4ef4837af2f90408b2a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42f90fab-d961-46e5-a769-daaf64b110f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:56:56.244768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fccdc1d2-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': '2421ac46e7a1eeaf960243dd59bc72cd403ed406a5ac80c232ca083007784916'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:56:56.244768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fccdca2e-109d-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12056.341605851, 'message_signature': 'cbdbee54cd02b7762607bec45c931d366d12e45775e123576a299c85264e1091'}]}, 'timestamp': '2026-02-23 09:56:56.245190', '_unique_id': '37bc28ea8c474e9ab38487d0dac7d098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:56:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:56:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:56:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 04:56:56 localhost dnsmasq[313115]: exiting on receipt of SIGTERM Feb 23 04:56:56 localhost podman[313349]: 2026-02-23 09:56:56.616563589 +0000 UTC m=+0.061751509 container kill a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:56:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:56:56 localhost systemd[1]: libpod-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1.scope: Deactivated successfully. Feb 23 04:56:56 localhost podman[313362]: 2026-02-23 09:56:56.682451984 +0000 UTC m=+0.053857535 container died a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0) Feb 23 04:56:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1-userdata-shm.mount: Deactivated successfully. Feb 23 04:56:56 localhost podman[313362]: 2026-02-23 09:56:56.718031623 +0000 UTC m=+0.089437094 container cleanup a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:56:56 localhost systemd[1]: libpod-conmon-a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1.scope: Deactivated successfully. 
Feb 23 04:56:56 localhost podman[313364]: 2026-02-23 09:56:56.739218598 +0000 UTC m=+0.097642987 container remove a96b696892d823e8b0bba4e645f2a1b2d19750d84c810551e0c6042fbbc568c1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2fe91a2a-5b02-4767-89ca-7f8954141d90, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 23 04:56:56 localhost kernel: device tap6b8e941b-e3 left promiscuous mode Feb 23 04:56:56 localhost ovn_controller[157695]: 2026-02-23T09:56:56Z|00200|binding|INFO|Releasing lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 from this chassis (sb_readonly=0) Feb 23 04:56:56 localhost ovn_controller[157695]: 2026-02-23T09:56:56Z|00201|binding|INFO|Setting lport 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 down in Southbound Feb 23 04:56:56 localhost nova_compute[282206]: 2026-02-23 09:56:56.776 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:56 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:56.782 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-2fe91a2a-5b02-4767-89ca-7f8954141d90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e8630a66fd9f41828b0bd2cf93b5956f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4c6d3781-7aae-4474-bf2c-0e950a13f37c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6b8e941b-e318-43c8-8da1-efc8c08d0ac8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:56:56 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:56.784 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6b8e941b-e318-43c8-8da1-efc8c08d0ac8 in datapath 2fe91a2a-5b02-4767-89ca-7f8954141d90 unbound from our chassis#033[00m Feb 23 04:56:56 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:56.785 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2fe91a2a-5b02-4767-89ca-7f8954141d90 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:56:56 localhost ovn_metadata_agent[163567]: 2026-02-23 09:56:56.785 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[a8393d00-fe05-4f60-a0dc-40681a74f55d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:56:56 localhost podman[313365]: 2026-02-23 09:56:56.786022864 +0000 UTC m=+0.149073307 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.openshift.expose-services=, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Feb 23 04:56:56 localhost nova_compute[282206]: 2026-02-23 09:56:56.800 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:56 localhost podman[313365]: 2026-02-23 09:56:56.825520114 +0000 UTC m=+0.188570507 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container) Feb 23 04:56:56 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:56:56 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:56.865 265541 INFO neutron.agent.dhcp.agent [None req-abb12fa1-32ad-49de-b347-2217f17a4166 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:56:57 localhost systemd[1]: var-lib-containers-storage-overlay-bb5a3a963be7354e78652f4a92c2c34304d37c406133dfb79b7eec719c4f26a2-merged.mount: Deactivated successfully. Feb 23 04:56:57 localhost systemd[1]: run-netns-qdhcp\x2d2fe91a2a\x2d5b02\x2d4767\x2d89ca\x2d7f8954141d90.mount: Deactivated successfully. Feb 23 04:56:57 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:56:57.823 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:56:57 localhost nova_compute[282206]: 2026-02-23 09:56:57.964 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:58.199 2 INFO neutron.agent.securitygroups_rpc [None req-527daa95-1754-43ce-9cc1-4adbaa0b338f d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:56:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:56:58.216 2 INFO neutron.agent.securitygroups_rpc [None req-7c0e711d-1075-4bc5-843f-4c1a8d666461 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:56:58 localhost ovn_controller[157695]: 2026-02-23T09:56:58Z|00202|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 
04:56:58 localhost nova_compute[282206]: 2026-02-23 09:56:58.725 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:56:58 localhost nova_compute[282206]: 2026-02-23 09:56:58.871 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:00 localhost ovn_controller[157695]: 2026-02-23T09:57:00Z|00203|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:57:00 localhost nova_compute[282206]: 2026-02-23 09:57:00.413 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:01 localhost ovn_controller[157695]: 2026-02-23T09:57:01Z|00204|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:57:01 localhost nova_compute[282206]: 2026-02-23 09:57:01.132 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:01 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:01.837 2 INFO neutron.agent.securitygroups_rpc [None req-7a651b51-72a3-44e2-bd5a-07e4363f8263 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']#033[00m Feb 23 04:57:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:02 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:02.041 2 INFO neutron.agent.securitygroups_rpc [None req-861b86c2-eda9-4017-bb33-3609a7ddd89c d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated 
['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:57:02 localhost nova_compute[282206]: 2026-02-23 09:57:02.967 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:05.231 265541 INFO neutron.agent.linux.ip_lib [None req-2f446502-99ed-410f-a2d7-3fc6d870c5d9 - - - - - -] Device tap73a840c6-4a cannot be used as it has no MAC address#033[00m Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.250 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost kernel: device tap73a840c6-4a entered promiscuous mode Feb 23 04:57:05 localhost NetworkManager[5974]: [1771840625.2577] manager: (tap73a840c6-4a): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.258 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost ovn_controller[157695]: 2026-02-23T09:57:05Z|00205|binding|INFO|Claiming lport 73a840c6-4a09-4a70-b6d2-862b377cac9d for this chassis. Feb 23 04:57:05 localhost ovn_controller[157695]: 2026-02-23T09:57:05Z|00206|binding|INFO|73a840c6-4a09-4a70-b6d2-862b377cac9d: Claiming unknown Feb 23 04:57:05 localhost systemd-udevd[313421]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:57:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:05.270 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aef873a00904cab867a4692ec3a78cb', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ce95503-eca6-4225-b8ee-cabdbb8ec29c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=73a840c6-4a09-4a70-b6d2-862b377cac9d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:05.278 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 73a840c6-4a09-4a70-b6d2-862b377cac9d in datapath 5c327632-3675-4ada-bcc8-d5fb15ecb5d7 bound to our chassis#033[00m Feb 23 04:57:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:05.280 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:57:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:05.281 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ccee22-37fe-4862-b839-b03c4c61d631]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost ovn_controller[157695]: 2026-02-23T09:57:05Z|00207|binding|INFO|Setting lport 73a840c6-4a09-4a70-b6d2-862b377cac9d ovn-installed in OVS Feb 23 04:57:05 localhost ovn_controller[157695]: 2026-02-23T09:57:05Z|00208|binding|INFO|Setting lport 73a840c6-4a09-4a70-b6d2-862b377cac9d up in Southbound Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.294 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.298 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.300 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost 
journal[231253]: ethtool ioctl error on tap73a840c6-4a: No such device Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.340 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost nova_compute[282206]: 2026-02-23 09:57:05.369 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:05 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:05.519 2 INFO neutron.agent.securitygroups_rpc [None req-12c23e79-c442-4ba1-a515-53d2f774a574 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:57:06 localhost podman[313492]: Feb 23 04:57:06 localhost podman[313492]: 2026-02-23 09:57:06.214309464 +0000 UTC m=+0.087610128 container create 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:57:06 localhost systemd[1]: Started libpod-conmon-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882.scope. Feb 23 04:57:06 localhost systemd[1]: Started libcrun container. 
Feb 23 04:57:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/658cf474e63c73733b33588ec74e8a0b39a8eb6542d46aec9fd53fcc5cc3218f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:57:06 localhost podman[313492]: 2026-02-23 09:57:06.182236752 +0000 UTC m=+0.055537416 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:57:06 localhost podman[313492]: 2026-02-23 09:57:06.28347073 +0000 UTC m=+0.156771384 container init 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 04:57:06 localhost podman[313492]: 2026-02-23 09:57:06.292057096 +0000 UTC m=+0.165357750 container start 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 23 04:57:06 localhost dnsmasq[313510]: started, version 2.85 cachesize 150 Feb 23 04:57:06 localhost dnsmasq[313510]: DNS service limited to local subnets Feb 23 04:57:06 localhost dnsmasq[313510]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:57:06 localhost dnsmasq[313510]: warning: no upstream servers configured Feb 23 04:57:06 localhost dnsmasq-dhcp[313510]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:57:06 localhost dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 0 addresses Feb 23 04:57:06 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host Feb 23 04:57:06 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts Feb 23 04:57:06 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:06.454 265541 INFO neutron.agent.dhcp.agent [None req-aaa66832-cd7b-4b68-b1e2-5ff8487e1f57 - - - - - -] DHCP configuration for ports {'237004ec-c4e6-4656-8fdf-edc428916b84'} is completed#033[00m Feb 23 04:57:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:57:07 localhost podman[313511]: 2026-02-23 09:57:07.155115381 +0000 UTC m=+0.078806287 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:57:07 localhost podman[313511]: 2026-02-23 09:57:07.171189717 +0000 UTC m=+0.094880613 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:57:07 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:57:07 localhost podman[313532]: 2026-02-23 09:57:07.24183849 +0000 UTC m=+0.056355233 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller) Feb 23 04:57:07 localhost podman[313532]: 2026-02-23 09:57:07.326283899 +0000 UTC m=+0.140800682 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 04:57:07 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:57:07 localhost nova_compute[282206]: 2026-02-23 09:57:07.970 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:08 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:08.919 2 INFO neutron.agent.securitygroups_rpc [None req-0e398338-d100-41a6-9496-902001e5b466 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']#033[00m Feb 23 04:57:08 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:08.971 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=8cb7a230-2eec-4e99-ae4f-fdf93493284b, ip_allocation=immediate, mac_address=fa:16:3e:fc:9e:c0, name=tempest-ExtraDHCPOptionsTestJSON-1719479753, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:02Z, description=, dns_domain=, id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-2021646158, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18544, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1542, status=ACTIVE, subnets=['357cab67-d9ec-4b7b-abb7-813d68b84f46'], tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:03Z, vlan_transparent=None, network_id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, port_security_enabled=True, 
project_id=8aef873a00904cab867a4692ec3a78cb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['58404006-bc14-42d2-ad0c-1fbb19168177'], standard_attr_id=1576, status=DOWN, tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:08Z on network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7#033[00m Feb 23 04:57:09 localhost dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 1 addresses Feb 23 04:57:09 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host Feb 23 04:57:09 localhost podman[313574]: 2026-02-23 09:57:09.219017165 +0000 UTC m=+0.060590913 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:09 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts Feb 23 04:57:09 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:09.306 2 INFO neutron.agent.securitygroups_rpc [None req-bf52f67a-8b52-4e6b-9a4f-c7fa470a3b71 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']#033[00m Feb 23 04:57:09 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:09.360 265541 INFO neutron.agent.linux.ip_lib [None req-172401bf-454c-4035-90c6-47dae912c45b - - - - - -] Device tap3200d85d-83 cannot be used as it has no MAC address#033[00m Feb 23 04:57:09 
localhost podman[242954]: time="2026-02-23T09:57:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.397 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost kernel: device tap3200d85d-83 entered promiscuous mode Feb 23 04:57:09 localhost podman[242954]: @ - - [23/Feb/2026:09:57:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1" Feb 23 04:57:09 localhost NetworkManager[5974]: [1771840629.4041] manager: (tap3200d85d-83): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Feb 23 04:57:09 localhost ovn_controller[157695]: 2026-02-23T09:57:09Z|00209|binding|INFO|Claiming lport 3200d85d-8350-45a5-8b75-5b140c0b6067 for this chassis. Feb 23 04:57:09 localhost ovn_controller[157695]: 2026-02-23T09:57:09Z|00210|binding|INFO|3200d85d-8350-45a5-8b75-5b140c0b6067: Claiming unknown Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost systemd-udevd[313604]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:57:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:09.416 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2e:2224/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13ab81953d004010a22a72d978d31c4d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8718afba-f509-414c-83e3-3464c2a9d2a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3200d85d-8350-45a5-8b75-5b140c0b6067) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:09.418 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 3200d85d-8350-45a5-8b75-5b140c0b6067 in datapath b12395f4-8746-4a5f-8dd6-aa83c6decb4b bound to our chassis#033[00m Feb 23 04:57:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:09.422 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4eaffcb3-e5a2-46b4-90f8-068ef050ee16 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 23 04:57:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:09.422 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b12395f4-8746-4a5f-8dd6-aa83c6decb4b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:57:09 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:09.423 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f99000e5-f8dd-43ee-930f-e93a62a43587]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:09 localhost podman[242954]: @ - - [23/Feb/2026:09:57:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19279 "" "Go-http-client/1.1" Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.454 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.455 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost ovn_controller[157695]: 2026-02-23T09:57:09Z|00211|binding|INFO|Setting lport 3200d85d-8350-45a5-8b75-5b140c0b6067 ovn-installed in OVS Feb 23 04:57:09 localhost ovn_controller[157695]: 2026-02-23T09:57:09Z|00212|binding|INFO|Setting lport 3200d85d-8350-45a5-8b75-5b140c0b6067 up in Southbound Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.460 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost journal[231253]: ethtool ioctl error on tap3200d85d-83: No such device Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.491 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost nova_compute[282206]: 2026-02-23 09:57:09.524 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:09 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:09.571 265541 INFO neutron.agent.dhcp.agent [None req-03869fb1-26fd-47c1-b63c-0635cf29e0b5 - - - - - -] DHCP configuration for ports {'8cb7a230-2eec-4e99-ae4f-fdf93493284b'} is completed#033[00m Feb 23 04:57:09 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:09.687 2 INFO neutron.agent.securitygroups_rpc [None req-459134b2-4c8a-4e1c-97e8-455a751256d3 d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:57:10 localhost podman[313675]: Feb 23 04:57:10 localhost podman[313675]: 2026-02-23 09:57:10.361913975 +0000 UTC m=+0.086793073 container create c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:10 localhost systemd[1]: Started libpod-conmon-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c.scope. Feb 23 04:57:10 localhost podman[313675]: 2026-02-23 09:57:10.319275758 +0000 UTC m=+0.044154916 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:57:10 localhost systemd[1]: tmp-crun.TmZwYe.mount: Deactivated successfully. Feb 23 04:57:10 localhost systemd[1]: Started libcrun container. Feb 23 04:57:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b96123a8fc3824dd5eb5d08dfdf32746d180cd8d815e2da97bc82b4af6ecaa2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:57:10 localhost podman[313675]: 2026-02-23 09:57:10.451706459 +0000 UTC m=+0.176585567 container init c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:57:10 localhost podman[313675]: 2026-02-23 09:57:10.462159212 +0000 UTC m=+0.187038320 container start c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 04:57:10 localhost dnsmasq[313695]: started, version 2.85 cachesize 150 Feb 23 04:57:10 localhost dnsmasq[313695]: DNS service limited to local subnets Feb 23 04:57:10 localhost dnsmasq[313695]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:57:10 localhost dnsmasq[313695]: warning: no upstream servers configured Feb 23 04:57:10 localhost dnsmasq-dhcp[313695]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:57:10 localhost dnsmasq[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/addn_hosts - 0 addresses Feb 23 04:57:10 localhost dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/host Feb 23 04:57:10 localhost dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/opts Feb 23 04:57:10 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:10.522 2 INFO neutron.agent.securitygroups_rpc [None req-fe5ed929-0971-4d54-b2a5-f195cec2efa3 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']#033[00m Feb 23 04:57:10 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.522 265541 INFO neutron.agent.dhcp.agent [None req-172401bf-454c-4035-90c6-47dae912c45b - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], 
binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f22d0768-46f3-45fe-bc61-792e40f72919, ip_allocation=immediate, mac_address=fa:16:3e:6e:cd:24, name=tempest-NetworksIpV6TestAttrs-136630378, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:03Z, description=, dns_domain=, id=b12395f4-8746-4a5f-8dd6-aa83c6decb4b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksIpV6TestAttrs-test-network-1394058243, port_security_enabled=True, project_id=13ab81953d004010a22a72d978d31c4d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=26666, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1550, status=ACTIVE, subnets=['6b0f1695-d31e-45f7-9ddd-6677f996c496'], tags=[], tenant_id=13ab81953d004010a22a72d978d31c4d, updated_at=2026-02-23T09:57:05Z, vlan_transparent=None, network_id=b12395f4-8746-4a5f-8dd6-aa83c6decb4b, port_security_enabled=True, project_id=13ab81953d004010a22a72d978d31c4d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b31260d7-60e9-40f6-abcc-b3b02fd41e3c'], standard_attr_id=1578, status=DOWN, tags=[], tenant_id=13ab81953d004010a22a72d978d31c4d, updated_at=2026-02-23T09:57:09Z on network b12395f4-8746-4a5f-8dd6-aa83c6decb4b#033[00m Feb 23 04:57:10 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.593 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:09Z, description=, device_id=, device_owner=, 
dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=bbf20927-9fb7-4141-8e3b-4dfa8e35b2b9, ip_allocation=immediate, mac_address=fa:16:3e:1d:8b:96, name=tempest-ExtraDHCPOptionsTestJSON-1081097454, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:02Z, description=, dns_domain=, id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-2021646158, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18544, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1542, status=ACTIVE, subnets=['357cab67-d9ec-4b7b-abb7-813d68b84f46'], tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:03Z, vlan_transparent=None, network_id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['58404006-bc14-42d2-ad0c-1fbb19168177'], standard_attr_id=1580, status=DOWN, tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:09Z on network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7#033[00m Feb 23 04:57:10 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.640 265541 INFO neutron.agent.dhcp.agent [None req-194fcf6d-77de-4d38-bb4c-72537dfbac3e - - - - - -] DHCP configuration for ports {'463c3006-8123-4c7a-9689-b280160ef34c'} is completed#033[00m Feb 23 04:57:10 localhost dnsmasq[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/addn_hosts - 1 addresses Feb 23 04:57:10 localhost podman[313715]: 2026-02-23 09:57:10.705029696 +0000 UTC m=+0.054698381 container kill 
c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:57:10 localhost dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/host Feb 23 04:57:10 localhost dnsmasq-dhcp[313695]: read /var/lib/neutron/dhcp/b12395f4-8746-4a5f-8dd6-aa83c6decb4b/opts Feb 23 04:57:10 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:10.713 2 INFO neutron.agent.securitygroups_rpc [None req-3f245d6d-a6d1-496c-9e86-70a6eb49954a d6a332b0f2e446a68dc6c19280644090 2207de28dcd245d2b198a56e6161001a - - default default] Security group member updated ['ba10f066-a353-4b6a-98b8-dab53422ee14']#033[00m Feb 23 04:57:10 localhost dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 2 addresses Feb 23 04:57:10 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host Feb 23 04:57:10 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts Feb 23 04:57:10 localhost podman[313746]: 2026-02-23 09:57:10.832227936 +0000 UTC m=+0.047828899 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:57:10 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:10.880 265541 INFO neutron.agent.dhcp.agent [None req-642c9972-ee2d-40ea-8224-86c8d4e0779b - - - - - -] DHCP configuration for ports {'f22d0768-46f3-45fe-bc61-792e40f72919'} is completed#033[00m Feb 23 04:57:11 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:11.090 265541 INFO neutron.agent.dhcp.agent [None req-546a045c-d9bb-4620-8897-b5ab8e72ae7c - - - - - -] DHCP configuration for ports {'bbf20927-9fb7-4141-8e3b-4dfa8e35b2b9'} is completed#033[00m Feb 23 04:57:11 localhost dnsmasq[313695]: exiting on receipt of SIGTERM Feb 23 04:57:11 localhost podman[313788]: 2026-02-23 09:57:11.299628136 +0000 UTC m=+0.060276083 container kill c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS) Feb 23 04:57:11 localhost systemd[1]: libpod-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c.scope: Deactivated successfully. Feb 23 04:57:11 localhost systemd[1]: tmp-crun.liA9ph.mount: Deactivated successfully. 
Feb 23 04:57:11 localhost podman[313801]: 2026-02-23 09:57:11.371378773 +0000 UTC m=+0.055686791 container died c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216) Feb 23 04:57:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c-userdata-shm.mount: Deactivated successfully. Feb 23 04:57:11 localhost systemd[1]: var-lib-containers-storage-overlay-5b96123a8fc3824dd5eb5d08dfdf32746d180cd8d815e2da97bc82b4af6ecaa2-merged.mount: Deactivated successfully. Feb 23 04:57:11 localhost podman[313801]: 2026-02-23 09:57:11.403620209 +0000 UTC m=+0.087928187 container cleanup c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:57:11 localhost systemd[1]: libpod-conmon-c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c.scope: Deactivated successfully. 
Feb 23 04:57:11 localhost podman[313802]: 2026-02-23 09:57:11.44281531 +0000 UTC m=+0.123696923 container remove c48caa9e2d0eb59ce80d11fce6a50402ac6f80f49b96c9732c49466b0636a48c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b12395f4-8746-4a5f-8dd6-aa83c6decb4b, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:11 localhost ovn_controller[157695]: 2026-02-23T09:57:11Z|00213|binding|INFO|Releasing lport 3200d85d-8350-45a5-8b75-5b140c0b6067 from this chassis (sb_readonly=0) Feb 23 04:57:11 localhost nova_compute[282206]: 2026-02-23 09:57:11.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:11 localhost ovn_controller[157695]: 2026-02-23T09:57:11Z|00214|binding|INFO|Setting lport 3200d85d-8350-45a5-8b75-5b140c0b6067 down in Southbound Feb 23 04:57:11 localhost kernel: device tap3200d85d-83 left promiscuous mode Feb 23 04:57:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:11.506 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe2e:2224/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-b12395f4-8746-4a5f-8dd6-aa83c6decb4b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '13ab81953d004010a22a72d978d31c4d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8718afba-f509-414c-83e3-3464c2a9d2a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3200d85d-8350-45a5-8b75-5b140c0b6067) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:11.508 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 3200d85d-8350-45a5-8b75-5b140c0b6067 in datapath b12395f4-8746-4a5f-8dd6-aa83c6decb4b unbound from our chassis#033[00m Feb 23 04:57:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:11.511 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b12395f4-8746-4a5f-8dd6-aa83c6decb4b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:57:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:11.512 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1810e062-aff9-4a21-9182-6e76399d6a0a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:11 localhost nova_compute[282206]: 2026-02-23 09:57:11.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:11 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:11.764 265541 INFO neutron.agent.dhcp.agent [None 
req-29575065-45e4-45d7-be30-be1abb3dc396 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:11 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:11.794 2 INFO neutron.agent.securitygroups_rpc [None req-a8c239ff-03aa-42a5-97c5-b84d2c82f384 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']#033[00m Feb 23 04:57:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:12 localhost dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 1 addresses Feb 23 04:57:12 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host Feb 23 04:57:12 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts Feb 23 04:57:12 localhost podman[313847]: 2026-02-23 09:57:12.02901136 +0000 UTC m=+0.056351532 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:57:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:57:12 localhost systemd[1]: run-netns-qdhcp\x2db12395f4\x2d8746\x2d4a5f\x2d8dd6\x2daa83c6decb4b.mount: Deactivated successfully. 
Feb 23 04:57:12 localhost podman[313869]: 2026-02-23 09:57:12.396146943 +0000 UTC m=+0.075491012 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:57:12 localhost podman[313869]: 2026-02-23 09:57:12.43323695 +0000 UTC m=+0.112581019 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, 
org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:12 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:57:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:12.628 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=8cb7a230-2eec-4e99-ae4f-fdf93493284b, ip_allocation=immediate, mac_address=fa:16:3e:fc:9e:c0, name=tempest-new-port-name-1817703293, network_id=5c327632-3675-4ada-bcc8-d5fb15ecb5d7, port_security_enabled=True, project_id=8aef873a00904cab867a4692ec3a78cb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['58404006-bc14-42d2-ad0c-1fbb19168177'], standard_attr_id=1576, status=DOWN, tags=[], tenant_id=8aef873a00904cab867a4692ec3a78cb, updated_at=2026-02-23T09:57:12Z on network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7#033[00m Feb 23 04:57:12 localhost dnsmasq[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 1 addresses Feb 23 04:57:12 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host Feb 23 04:57:12 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts Feb 23 04:57:12 localhost podman[313903]: 2026-02-23 09:57:12.848216221 +0000 UTC m=+0.060412848 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, io.buildah.version=1.43.0, 
org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:13 localhost nova_compute[282206]: 2026-02-23 09:57:13.011 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:13 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:13.086 265541 INFO neutron.agent.dhcp.agent [None req-03682992-3a6f-4acd-a296-ef8b2025e96c - - - - - -] DHCP configuration for ports {'8cb7a230-2eec-4e99-ae4f-fdf93493284b'} is completed#033[00m Feb 23 04:57:13 localhost openstack_network_exporter[245358]: ERROR 09:57:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:57:13 localhost openstack_network_exporter[245358]: Feb 23 04:57:13 localhost openstack_network_exporter[245358]: ERROR 09:57:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:57:13 localhost openstack_network_exporter[245358]: Feb 23 04:57:13 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:13.525 2 INFO neutron.agent.securitygroups_rpc [None req-7a9cb938-6590-4a31-87fa-171ce7ffabc7 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['feaffec7-34aa-4c16-87f3-892bafdc2b78']#033[00m Feb 23 04:57:14 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:14.209 2 INFO neutron.agent.securitygroups_rpc [None req-d957f830-6dae-4ea3-8f5c-2c9a9324e520 9a9c222b96714eb4b3e886d05c8c4dce 8aef873a00904cab867a4692ec3a78cb - - default default] Security group member updated ['58404006-bc14-42d2-ad0c-1fbb19168177']#033[00m Feb 23 04:57:14 localhost dnsmasq[313510]: read 
/var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/addn_hosts - 0 addresses Feb 23 04:57:14 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/host Feb 23 04:57:14 localhost podman[313939]: 2026-02-23 09:57:14.450331679 +0000 UTC m=+0.055868377 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:14 localhost dnsmasq-dhcp[313510]: read /var/lib/neutron/dhcp/5c327632-3675-4ada-bcc8-d5fb15ecb5d7/opts Feb 23 04:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:57:14 localhost podman[313959]: 2026-02-23 09:57:14.911468096 +0000 UTC m=+0.085847323 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_managed=true) Feb 23 04:57:14 localhost 
podman[313959]: 2026-02-23 09:57:14.945370713 +0000 UTC m=+0.119749900 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible) Feb 23 04:57:14 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:57:15 localhost dnsmasq[313510]: exiting on receipt of SIGTERM Feb 23 04:57:15 localhost podman[313994]: 2026-02-23 09:57:15.21327145 +0000 UTC m=+0.061256173 container kill 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:15 localhost systemd[1]: libpod-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882.scope: Deactivated successfully. Feb 23 04:57:15 localhost podman[314006]: 2026-02-23 09:57:15.294428148 +0000 UTC m=+0.067508207 container died 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:15 localhost podman[314006]: 2026-02-23 09:57:15.325188658 +0000 UTC m=+0.098268657 container cleanup 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:15 localhost systemd[1]: libpod-conmon-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882.scope: Deactivated successfully. Feb 23 04:57:15 localhost podman[314008]: 2026-02-23 09:57:15.366724401 +0000 UTC m=+0.132587377 container remove 01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5c327632-3675-4ada-bcc8-d5fb15ecb5d7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:15 localhost ovn_controller[157695]: 2026-02-23T09:57:15Z|00215|binding|INFO|Releasing lport 73a840c6-4a09-4a70-b6d2-862b377cac9d from this chassis (sb_readonly=0) Feb 23 04:57:15 localhost kernel: device tap73a840c6-4a left promiscuous mode Feb 23 04:57:15 localhost ovn_controller[157695]: 2026-02-23T09:57:15Z|00216|binding|INFO|Setting lport 73a840c6-4a09-4a70-b6d2-862b377cac9d down in Southbound Feb 23 04:57:15 localhost nova_compute[282206]: 2026-02-23 09:57:15.381 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:15.387 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c327632-3675-4ada-bcc8-d5fb15ecb5d7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aef873a00904cab867a4692ec3a78cb', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ce95503-eca6-4225-b8ee-cabdbb8ec29c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=73a840c6-4a09-4a70-b6d2-862b377cac9d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:15.389 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 73a840c6-4a09-4a70-b6d2-862b377cac9d in datapath 5c327632-3675-4ada-bcc8-d5fb15ecb5d7 unbound from our chassis#033[00m Feb 23 04:57:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:15.392 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c327632-3675-4ada-bcc8-d5fb15ecb5d7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:57:15 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:15.392 163675 DEBUG oslo.privsep.daemon [-] privsep: 
reply[cf369efd-107b-423d-b55f-912bb4cd0919]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:15 localhost nova_compute[282206]: 2026-02-23 09:57:15.400 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:15 localhost systemd[1]: var-lib-containers-storage-overlay-658cf474e63c73733b33588ec74e8a0b39a8eb6542d46aec9fd53fcc5cc3218f-merged.mount: Deactivated successfully. Feb 23 04:57:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01161c4a98a010c56a7785a367a22f589e24b894de2e0a91b7997e72600f5882-userdata-shm.mount: Deactivated successfully. Feb 23 04:57:15 localhost systemd[1]: run-netns-qdhcp\x2d5c327632\x2d3675\x2d4ada\x2dbcc8\x2dd5fb15ecb5d7.mount: Deactivated successfully. Feb 23 04:57:15 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:15.625 265541 INFO neutron.agent.dhcp.agent [None req-30e9e5c2-b621-45ea-9d63-28d07266ef16 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:16.039 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:16.609 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:17 localhost ovn_controller[157695]: 2026-02-23T09:57:17Z|00217|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:57:17 localhost nova_compute[282206]: 2026-02-23 09:57:17.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:17 localhost sshd[314037]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:57:17 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:17.712 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cdd9294-518a-4cd4-8e14-e309ee77be41, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a147cb7a-5506-4d6a-9946-52357210b7c0) old=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:17 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:17.715 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a147cb7a-5506-4d6a-9946-52357210b7c0 in datapath c762e206-cc42-4a9e-b8ad-4f8da87fd30e updated#033[00m Feb 23 04:57:17 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:17.718 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c762e206-cc42-4a9e-b8ad-4f8da87fd30e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:57:17 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:17.719 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5af42ff8-fc8a-4135-b1a3-01dd64211291]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:18 localhost nova_compute[282206]: 2026-02-23 09:57:18.015 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0. 
Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.028231) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638028269, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2313, "num_deletes": 262, "total_data_size": 2324773, "memory_usage": 2374528, "flush_reason": "Manual Compaction"} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638038077, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2261070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25262, "largest_seqno": 27574, "table_properties": {"data_size": 2251713, "index_size": 5862, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20328, "raw_average_key_size": 21, "raw_value_size": 2232616, "raw_average_value_size": 2332, "num_data_blocks": 255, "num_entries": 957, "num_filter_entries": 957, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840465, "oldest_key_time": 1771840465, "file_creation_time": 1771840638, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 9882 microseconds, and 4741 cpu microseconds. Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.038111) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2261070 bytes OK Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.038131) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.042805) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.042819) EVENT_LOG_v1 {"time_micros": 1771840638042814, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.042836) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2315088, prev total WAL file 
size 2315088, number of live WAL files 2. Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.043562) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end) Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2208KB)], [45(17MB)] Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638043631, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 20138191, "oldest_snapshot_seqno": -1} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12589 keys, 16805521 bytes, temperature: kUnknown Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638146571, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 16805521, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16732855, "index_size": 40117, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31493, "raw_key_size": 336956, "raw_average_key_size": 26, "raw_value_size": 
16517500, "raw_average_value_size": 1312, "num_data_blocks": 1533, "num_entries": 12589, "num_filter_entries": 12589, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840638, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.146963) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 16805521 bytes Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.149239) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.5 rd, 163.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 17.0 +0.0 blob) out(16.0 +0.0 blob), read-write-amplify(16.3) write-amplify(7.4) OK, records in: 13126, records dropped: 537 output_compression: NoCompression Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.149276) EVENT_LOG_v1 {"time_micros": 1771840638149260, "job": 26, "event": "compaction_finished", "compaction_time_micros": 103025, "compaction_time_cpu_micros": 46138, "output_level": 6, "num_output_files": 1, "total_output_size": 16805521, "num_input_records": 13126, "num_output_records": 12589, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638149845, "job": 26, "event": "table_file_deletion", "file_number": 47} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840638153152, 
"job": 26, "event": "table_file_deletion", "file_number": 45} Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.043461) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153326) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:57:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:57:18.153335) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:57:18 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:18.547 2 INFO neutron.agent.securitygroups_rpc [None req-c9ef2d4b-d1c0-41a7-a7dd-2f93a7b3fec0 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['feaffec7-34aa-4c16-87f3-892bafdc2b78', '5909553e-06f7-4a4f-a61c-c51f18e5203a']#033[00m Feb 23 04:57:19 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:19.292 2 INFO neutron.agent.securitygroups_rpc [None req-37a94c58-b8ba-4f2f-a685-f32e299cd9ba 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['5909553e-06f7-4a4f-a61c-c51f18e5203a']#033[00m Feb 23 04:57:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:57:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:57:20 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:57:20 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:57:20 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:20.368 265541 INFO neutron.agent.linux.ip_lib [None req-25e29513-5b46-47b7-a32e-94580f62f62c - - - - - -] Device tapa809c819-39 cannot be used as it has no MAC address#033[00m Feb 23 04:57:20 localhost nova_compute[282206]: 2026-02-23 09:57:20.396 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:20 localhost kernel: device tapa809c819-39 entered promiscuous mode Feb 23 04:57:20 localhost NetworkManager[5974]: [1771840640.4085] manager: (tapa809c819-39): new Generic device (/org/freedesktop/NetworkManager/Devices/36) Feb 23 04:57:20 localhost nova_compute[282206]: 2026-02-23 09:57:20.407 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:20 localhost systemd-udevd[314134]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:57:20 localhost nova_compute[282206]: 2026-02-23 09:57:20.416 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost nova_compute[282206]: 2026-02-23 09:57:20.440 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost journal[231253]: ethtool ioctl error on tapa809c819-39: No such device Feb 23 04:57:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:57:20 localhost nova_compute[282206]: 2026-02-23 09:57:20.484 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:57:20 localhost nova_compute[282206]: 2026-02-23 09:57:20.513 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.056 282211 DEBUG 
oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:57:21 localhost podman[314205]: Feb 23 04:57:21 localhost podman[314205]: 2026-02-23 09:57:21.31476689 +0000 UTC m=+0.090098945 container create d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 04:57:21 localhost systemd[1]: Started libpod-conmon-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a.scope. Feb 23 04:57:21 localhost podman[314205]: 2026-02-23 09:57:21.270685607 +0000 UTC m=+0.046017662 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:57:21 localhost systemd[1]: tmp-crun.Ys8o94.mount: Deactivated successfully. 
Feb 23 04:57:21 localhost systemd[1]: Started libcrun container. Feb 23 04:57:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f42a7192ddb811b49db5c05e8b220a055625d570407e7c102749a3e2675b6520/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:57:21 localhost podman[314205]: 2026-02-23 09:57:21.411940732 +0000 UTC m=+0.187272797 container init d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:21 localhost podman[314205]: 2026-02-23 09:57:21.420562888 +0000 UTC m=+0.195894943 container start d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:57:21 localhost dnsmasq[314223]: started, version 2.85 cachesize 150 Feb 23 04:57:21 localhost dnsmasq[314223]: DNS service limited to local subnets Feb 23 04:57:21 localhost dnsmasq[314223]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 
04:57:21 localhost dnsmasq[314223]: warning: no upstream servers configured Feb 23 04:57:21 localhost dnsmasq-dhcp[314223]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:57:21 localhost dnsmasq[314223]: read /var/lib/neutron/dhcp/c0f06419-a50d-48bd-89c2-2b77ee893c23/addn_hosts - 0 addresses Feb 23 04:57:21 localhost dnsmasq-dhcp[314223]: read /var/lib/neutron/dhcp/c0f06419-a50d-48bd-89c2-2b77ee893c23/host Feb 23 04:57:21 localhost dnsmasq-dhcp[314223]: read /var/lib/neutron/dhcp/c0f06419-a50d-48bd-89c2-2b77ee893c23/opts Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.455 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.455 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.456 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.456 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:57:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:57:21 localhost 
neutron_dhcp_agent[265537]: 2026-02-23 09:57:21.562 265541 INFO neutron.agent.dhcp.agent [None req-6d2f5670-a36c-4149-a35c-fe66671e2499 - - - - - -] DHCP configuration for ports {'bea39c1a-6fe3-4e05-84fb-8c695be45a3d'} is completed#033[00m Feb 23 04:57:21 localhost dnsmasq[314223]: exiting on receipt of SIGTERM Feb 23 04:57:21 localhost systemd[1]: libpod-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a.scope: Deactivated successfully. Feb 23 04:57:21 localhost podman[314240]: 2026-02-23 09:57:21.666798576 +0000 UTC m=+0.061986406 container kill d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:21 localhost podman[314252]: 2026-02-23 09:57:21.737409447 +0000 UTC m=+0.057303431 container died d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0) Feb 23 04:57:21 localhost podman[314252]: 2026-02-23 09:57:21.763433651 +0000 UTC m=+0.083327615 container cleanup d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:57:21 localhost systemd[1]: libpod-conmon-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a.scope: Deactivated successfully. Feb 23 04:57:21 localhost podman[314254]: 2026-02-23 09:57:21.816423298 +0000 UTC m=+0.129600445 container remove d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0f06419-a50d-48bd-89c2-2b77ee893c23, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.829 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:21 localhost kernel: device tapa809c819-39 left promiscuous mode Feb 23 04:57:21 localhost nova_compute[282206]: 2026-02-23 09:57:21.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:21 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:21.864 265541 INFO neutron.agent.dhcp.agent [None req-a6b448ca-e319-42dd-83f5-345c00ce6e49 - - - - - -] Network not present, 
action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:21 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:21.865 265541 INFO neutron.agent.dhcp.agent [None req-a6b448ca-e319-42dd-83f5-345c00ce6e49 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:22 localhost nova_compute[282206]: 2026-02-23 09:57:22.242 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:57:22 localhost nova_compute[282206]: 2026-02-23 
09:57:22.272 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:57:22 localhost nova_compute[282206]: 2026-02-23 09:57:22.273 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:57:22 localhost systemd[1]: var-lib-containers-storage-overlay-f42a7192ddb811b49db5c05e8b220a055625d570407e7c102749a3e2675b6520-merged.mount: Deactivated successfully. Feb 23 04:57:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d057d13f16d27b5ec0564cacf762096d87585b298d5c4ad835d7b8e2aefed53a-userdata-shm.mount: Deactivated successfully. Feb 23 04:57:22 localhost systemd[1]: run-netns-qdhcp\x2dc0f06419\x2da50d\x2d48bd\x2d89c2\x2d2b77ee893c23.mount: Deactivated successfully. Feb 23 04:57:23 localhost nova_compute[282206]: 2026-02-23 09:57:23.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:24 localhost nova_compute[282206]: 2026-02-23 09:57:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:24 localhost nova_compute[282206]: 2026-02-23 09:57:24.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:57:24 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:24.753 2 INFO neutron.agent.securitygroups_rpc [None req-3b70b1d6-deb1-439a-bfa9-157fec652445 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['c0f5985b-58d8-49b8-86bf-b98bcb003892']#033[00m Feb 23 04:57:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:57:24 localhost podman[314282]: 2026-02-23 09:57:24.905974922 +0000 UTC m=+0.080783447 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 04:57:24 localhost podman[314282]: 2026-02-23 09:57:24.917484947 +0000 UTC m=+0.092293812 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:57:24 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:57:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e127 do_prune osdmap full prune enabled Feb 23 04:57:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 e128: 6 total, 6 up, 6 in Feb 23 04:57:25 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in Feb 23 04:57:25 localhost nova_compute[282206]: 2026-02-23 09:57:25.057 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:25 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:25.189 2 INFO neutron.agent.securitygroups_rpc [None req-92aad836-c088-4438-915c-a0369f05fd7b 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['c0f5985b-58d8-49b8-86bf-b98bcb003892']#033[00m Feb 23 04:57:25 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:25.801 2 INFO neutron.agent.securitygroups_rpc [None 
req-76b2a14f-bec0-4305-8207-ebe5d9da3f5c 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['4833eb66-b753-4947-9fd8-0b38ba04d2e6']#033[00m Feb 23 04:57:26 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:26.561 265541 INFO neutron.agent.linux.ip_lib [None req-a47420e0-4163-4fed-9576-7de53f70d6c5 - - - - - -] Device tapb3ce4649-e6 cannot be used as it has no MAC address#033[00m Feb 23 04:57:26 localhost nova_compute[282206]: 2026-02-23 09:57:26.580 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:26 localhost kernel: device tapb3ce4649-e6 entered promiscuous mode Feb 23 04:57:26 localhost ovn_controller[157695]: 2026-02-23T09:57:26Z|00218|binding|INFO|Claiming lport b3ce4649-e63b-416c-91b2-f98efa2cafaa for this chassis. Feb 23 04:57:26 localhost ovn_controller[157695]: 2026-02-23T09:57:26Z|00219|binding|INFO|b3ce4649-e63b-416c-91b2-f98efa2cafaa: Claiming unknown Feb 23 04:57:26 localhost nova_compute[282206]: 2026-02-23 09:57:26.587 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:26 localhost NetworkManager[5974]: [1771840646.5909] manager: (tapb3ce4649-e6): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Feb 23 04:57:26 localhost systemd-udevd[314313]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:57:26 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:26.600 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84e01b3-7a51-49d7-acad-f20a75f6eb9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b3ce4649-e63b-416c-91b2-f98efa2cafaa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:26 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:26.601 163572 INFO neutron.agent.ovn.metadata.agent [-] Port b3ce4649-e63b-416c-91b2-f98efa2cafaa in datapath bc69bb6e-6101-4390-bb1b-15e16ba6649d bound to our chassis#033[00m Feb 23 04:57:26 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:26.602 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network bc69bb6e-6101-4390-bb1b-15e16ba6649d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:57:26 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:26.603 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[4e15933b-ad98-45fe-b495-704e6d07f7d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost ovn_controller[157695]: 2026-02-23T09:57:26Z|00220|binding|INFO|Setting lport b3ce4649-e63b-416c-91b2-f98efa2cafaa ovn-installed in OVS Feb 23 04:57:26 localhost ovn_controller[157695]: 2026-02-23T09:57:26Z|00221|binding|INFO|Setting lport b3ce4649-e63b-416c-91b2-f98efa2cafaa up in Southbound Feb 23 04:57:26 localhost nova_compute[282206]: 2026-02-23 09:57:26.630 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost journal[231253]: ethtool ioctl error on tapb3ce4649-e6: No such device Feb 23 04:57:26 localhost nova_compute[282206]: 2026-02-23 09:57:26.667 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:26 localhost nova_compute[282206]: 2026-02-23 09:57:26.695 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:26 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:26.890 2 INFO neutron.agent.securitygroups_rpc [None req-0bf5f6c8-9298-4358-bc07-570223771cfc 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:27 localhost nova_compute[282206]: 2026-02-23 09:57:27.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:27 localhost nova_compute[282206]: 2026-02-23 09:57:27.443 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:27 localhost podman[314382]: Feb 23 04:57:27 localhost podman[314382]: 2026-02-23 09:57:27.562528036 +0000 UTC m=+0.089799385 container create 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216) Feb 23 04:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:57:27 localhost systemd[1]: Started libpod-conmon-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424.scope. Feb 23 04:57:27 localhost systemd[1]: tmp-crun.63OFTK.mount: Deactivated successfully. Feb 23 04:57:27 localhost podman[314382]: 2026-02-23 09:57:27.520001773 +0000 UTC m=+0.047273172 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:57:27 localhost systemd[1]: Started libcrun container. Feb 23 04:57:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/06e3421c55fa244bcd15bc7912f7bbd56b819c3903bda310ddc27d1afd8ce870/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:57:27 localhost podman[314382]: 2026-02-23 09:57:27.639465744 +0000 UTC m=+0.166737103 container init 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, io.buildah.version=1.43.0) Feb 23 04:57:27 localhost dnsmasq[314412]: started, version 2.85 cachesize 150 Feb 23 04:57:27 localhost dnsmasq[314412]: DNS service limited to local subnets Feb 23 04:57:27 localhost dnsmasq[314412]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:57:27 localhost dnsmasq[314412]: warning: no upstream servers configured Feb 23 04:57:27 localhost dnsmasq-dhcp[314412]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:57:27 localhost dnsmasq[314412]: read 
/var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 0 addresses Feb 23 04:57:27 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host Feb 23 04:57:27 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts Feb 23 04:57:27 localhost podman[314397]: 2026-02-23 09:57:27.691960446 +0000 UTC m=+0.092177719 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, release=1770267347) Feb 23 04:57:27 localhost podman[314382]: 2026-02-23 09:57:27.69922851 +0000 UTC m=+0.226499869 container start 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:27 localhost podman[314397]: 2026-02-23 09:57:27.72933353 +0000 UTC m=+0.129550773 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal) Feb 23 04:57:27 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:27.734 2 INFO neutron.agent.securitygroups_rpc [None req-f7ddc684-702a-4c23-8012-3babeac19865 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:27 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:57:27 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:27.860 265541 INFO neutron.agent.dhcp.agent [None req-ea245279-e00b-4bbe-be88-7daf5f708a6c - - - - - -] DHCP configuration for ports {'2321dadf-6b0b-4e84-985a-8e2e5348f794'} is completed#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.050 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:28 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:28.074 2 INFO neutron.agent.securitygroups_rpc [None req-40990976-1223-4b33-bf88-75d4d256e3c2 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.079 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.079 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task 
ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.080 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.102 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.102 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.103 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.103 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:57:28 localhost 
nova_compute[282206]: 2026-02-23 09:57:28.104 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:57:28 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:28.308 2 INFO neutron.agent.securitygroups_rpc [None req-db0e7eff-9ed0-4b03-af25-1355357a4e27 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:28 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:28.564 2 INFO neutron.agent.securitygroups_rpc [None req-31c5f26a-a932-4879-896f-b6452f801d39 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']#033[00m Feb 23 04:57:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:57:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/913075697' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:57:28 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:28.579 2 INFO neutron.agent.securitygroups_rpc [None req-6e46d629-2807-49ac-9764-efa5e97c15d9 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.598 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:57:28 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:28.625 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.654 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.655 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.836 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.837 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11351MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.838 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.838 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:57:28 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:28.844 2 INFO neutron.agent.securitygroups_rpc [None req-6e38729b-cba3-4aa6-8317-eef924b8040d 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.941 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.942 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.942 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:57:28 localhost nova_compute[282206]: 2026-02-23 09:57:28.988 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:57:29 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:29.198 2 INFO neutron.agent.securitygroups_rpc [None req-08211a26-5630-4885-b1ea-4c86be58309d 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:57:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2744435039' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:57:29 localhost nova_compute[282206]: 2026-02-23 09:57:29.462 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:57:29 localhost nova_compute[282206]: 2026-02-23 09:57:29.469 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:57:29 localhost nova_compute[282206]: 2026-02-23 09:57:29.500 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:57:29 localhost nova_compute[282206]: 2026-02-23 09:57:29.504 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:57:29 localhost nova_compute[282206]: 2026-02-23 09:57:29.504 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.666s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:57:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:29.652 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1cdd9294-518a-4cd4-8e14-e309ee77be41, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a147cb7a-5506-4d6a-9946-52357210b7c0) old=Port_Binding(mac=['fa:16:3e:df:31:67 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c762e206-cc42-4a9e-b8ad-4f8da87fd30e', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8532226521ac43ca82723a0b71168e03', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:29.654 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a147cb7a-5506-4d6a-9946-52357210b7c0 in datapath c762e206-cc42-4a9e-b8ad-4f8da87fd30e updated#033[00m Feb 23 04:57:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:29.657 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c762e206-cc42-4a9e-b8ad-4f8da87fd30e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:57:29 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:29.658 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[df72594f-aac3-474f-bfdd-83ee0f141a58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:29 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:29.674 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:29 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:29.848 2 INFO neutron.agent.securitygroups_rpc [None req-73652ffc-e47d-466c-823f-eff68fd895b7 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:30 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:30.143 2 INFO neutron.agent.securitygroups_rpc [None req-fa7b29ab-7d8a-4213-9062-a27a98eacca6 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] 
Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:31 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:31.273 2 INFO neutron.agent.securitygroups_rpc [None req-9681c33b-6d83-4d64-b717-f3c8315322b0 3247f4a1ec054de78018b025a6933ab5 13ab81953d004010a22a72d978d31c4d - - default default] Security group member updated ['b31260d7-60e9-40f6-abcc-b3b02fd41e3c']#033[00m Feb 23 04:57:31 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:31.403 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:31 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:31.404 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:57:31 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:31.448 2 INFO neutron.agent.securitygroups_rpc [None req-e145e96f-8a3e-446a-b7a0-f3b217974e58 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['170ab605-ad28-44f3-a2b8-68fbb6ed92e0']#033[00m Feb 23 04:57:31 localhost nova_compute[282206]: 2026-02-23 09:57:31.454 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:32 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:32.150 2 INFO neutron.agent.securitygroups_rpc [None req-3adfef6e-10be-4bfb-8ba9-878d58df2fe8 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['dd2fbf50-7988-4b4c-88a5-46b24a60bfee', 'b620ad4c-7f28-46c7-a322-d11687a2bc43', '4833eb66-b753-4947-9fd8-0b38ba04d2e6']#033[00m Feb 23 04:57:32 localhost nova_compute[282206]: 2026-02-23 09:57:32.480 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:57:33 localhost nova_compute[282206]: 2026-02-23 09:57:33.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:33 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:33.636 2 INFO neutron.agent.securitygroups_rpc [None req-e46f16e8-8d8a-4032-be19-52144050a32d 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['dd2fbf50-7988-4b4c-88a5-46b24a60bfee', 'b620ad4c-7f28-46c7-a322-d11687a2bc43']#033[00m Feb 23 04:57:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:33.877 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:33Z, description=, device_id=7647edfc-3361-4769-b093-9cecdc6821d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=fd50e0c4-aa06-4c9b-842f-9aaadd740dc1, 
ip_allocation=immediate, mac_address=fa:16:3e:1a:97:03, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:22Z, description=, dns_domain=, id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-336483506, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9099, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1679, status=ACTIVE, subnets=['2defc5bd-3127-4ac7-b565-39a0a9014c59'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:24Z, vlan_transparent=None, network_id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1739, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:33Z on network bc69bb6e-6101-4390-bb1b-15e16ba6649d#033[00m Feb 23 04:57:33 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:33.911 2 INFO neutron.agent.securitygroups_rpc [None req-a5ef7891-4d9f-416c-9e05-7114c855bf40 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['b8e8f331-db67-4dd9-802a-05abdcea8dd4']#033[00m Feb 23 04:57:34 localhost podman[314483]: 2026-02-23 09:57:34.078166842 +0000 UTC m=+0.061277684 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:34 localhost systemd[1]: tmp-crun.gVNq9F.mount: Deactivated successfully. Feb 23 04:57:34 localhost dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 1 addresses Feb 23 04:57:34 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host Feb 23 04:57:34 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts Feb 23 04:57:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:34.328 265541 INFO neutron.agent.dhcp.agent [None req-74e35aad-a10d-4d66-8adc-b15e3d8d9b18 - - - - - -] DHCP configuration for ports {'fd50e0c4-aa06-4c9b-842f-9aaadd740dc1'} is completed#033[00m Feb 23 04:57:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e128 do_prune osdmap full prune enabled Feb 23 04:57:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e129 e129: 6 total, 6 up, 6 in Feb 23 04:57:34 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in Feb 23 04:57:34 localhost nova_compute[282206]: 2026-02-23 09:57:34.870 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:35 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:35.069 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:33Z, description=, device_id=7647edfc-3361-4769-b093-9cecdc6821d1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], 
id=fd50e0c4-aa06-4c9b-842f-9aaadd740dc1, ip_allocation=immediate, mac_address=fa:16:3e:1a:97:03, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:22Z, description=, dns_domain=, id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-336483506, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=9099, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1679, status=ACTIVE, subnets=['2defc5bd-3127-4ac7-b565-39a0a9014c59'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:24Z, vlan_transparent=None, network_id=bc69bb6e-6101-4390-bb1b-15e16ba6649d, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1739, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:33Z on network bc69bb6e-6101-4390-bb1b-15e16ba6649d#033[00m Feb 23 04:57:35 localhost dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 1 addresses Feb 23 04:57:35 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host Feb 23 04:57:35 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts Feb 23 04:57:35 localhost podman[314521]: 2026-02-23 09:57:35.281703756 +0000 UTC m=+0.062363547 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, org.label-schema.license=GPLv2, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 04:57:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e129 do_prune osdmap full prune enabled Feb 23 04:57:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e130 e130: 6 total, 6 up, 6 in Feb 23 04:57:35 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in Feb 23 04:57:35 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:35.522 265541 INFO neutron.agent.dhcp.agent [None req-a64dc3ee-fcca-4965-b775-95ac3b7231c7 - - - - - -] DHCP configuration for ports {'fd50e0c4-aa06-4c9b-842f-9aaadd740dc1'} is completed#033[00m Feb 23 04:57:35 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:35.814 2 INFO neutron.agent.securitygroups_rpc [None req-db00dc8c-6ada-4161-9538-1913f2da78b1 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['a98db953-42d0-4f19-90d3-a50bfc8bf55e']#033[00m Feb 23 04:57:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e130 do_prune osdmap full prune enabled Feb 23 04:57:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 e131: 6 total, 6 up, 6 in Feb 23 04:57:36 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in Feb 23 04:57:36 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:36.863 2 INFO neutron.agent.securitygroups_rpc [None req-e2d7d9c7-9cf6-4713-8a6e-85619a152752 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['a98db953-42d0-4f19-90d3-a50bfc8bf55e']#033[00m Feb 23 04:57:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:57:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:57:37 localhost podman[314541]: 2026-02-23 09:57:37.918618634 +0000 UTC m=+0.080766936 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller) Feb 23 04:57:37 localhost podman[314541]: 2026-02-23 
09:57:37.963554833 +0000 UTC m=+0.125703135 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Feb 23 04:57:37 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:57:38 localhost podman[314542]: 2026-02-23 09:57:37.965827193 +0000 UTC m=+0.123778095 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:57:38 localhost podman[314542]: 2026-02-23 09:57:38.046016261 +0000 UTC m=+0.203967263 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:57:38 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:57:38 localhost nova_compute[282206]: 2026-02-23 09:57:38.059 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:39 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:39.005 2 INFO neutron.agent.securitygroups_rpc [None req-4377d041-aa51-4800-a96f-871d02773dd7 7e2d209f03d04e6d86f4e10f7490cf37 8532226521ac43ca82723a0b71168e03 - - default default] Security group member updated ['709ad995-bfde-4096-a0b4-2ba30248a611']#033[00m Feb 23 04:57:39 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:39.369 2 INFO neutron.agent.securitygroups_rpc [None req-0a6c1cd1-12af-4dd6-b394-41e125fa511a 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['5db8e770-276e-4b00-beb9-c97310b59e62']#033[00m Feb 23 04:57:39 localhost podman[242954]: time="2026-02-23T09:57:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:57:39 localhost podman[242954]: @ - - [23/Feb/2026:09:57:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1" Feb 23 04:57:39 localhost podman[242954]: @ - - [23/Feb/2026:09:57:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19287 "" "Go-http-client/1.1" Feb 23 04:57:39 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:39.752 2 INFO neutron.agent.securitygroups_rpc [None req-bb9f0bb2-0d36-4f02-872c-c348f9c20cb9 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['5db8e770-276e-4b00-beb9-c97310b59e62']#033[00m Feb 23 04:57:40 localhost dnsmasq[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/addn_hosts - 0 addresses Feb 23 04:57:40 localhost dnsmasq-dhcp[314412]: read 
/var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/host Feb 23 04:57:40 localhost dnsmasq-dhcp[314412]: read /var/lib/neutron/dhcp/bc69bb6e-6101-4390-bb1b-15e16ba6649d/opts Feb 23 04:57:40 localhost podman[314605]: 2026-02-23 09:57:40.348846108 +0000 UTC m=+0.056730684 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 04:57:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:40.406 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:57:40 localhost ovn_controller[157695]: 2026-02-23T09:57:40Z|00222|binding|INFO|Releasing lport b3ce4649-e63b-416c-91b2-f98efa2cafaa from this chassis (sb_readonly=0) Feb 23 04:57:40 localhost kernel: device tapb3ce4649-e6 left promiscuous mode Feb 23 04:57:40 localhost ovn_controller[157695]: 2026-02-23T09:57:40Z|00223|binding|INFO|Setting lport b3ce4649-e63b-416c-91b2-f98efa2cafaa down in Southbound Feb 23 04:57:40 localhost nova_compute[282206]: 2026-02-23 09:57:40.517 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:40.529 163572 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc69bb6e-6101-4390-bb1b-15e16ba6649d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f84e01b3-7a51-49d7-acad-f20a75f6eb9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b3ce4649-e63b-416c-91b2-f98efa2cafaa) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:40.531 163572 INFO neutron.agent.ovn.metadata.agent [-] Port b3ce4649-e63b-416c-91b2-f98efa2cafaa in datapath bc69bb6e-6101-4390-bb1b-15e16ba6649d unbound from our chassis#033[00m Feb 23 04:57:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:40.534 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc69bb6e-6101-4390-bb1b-15e16ba6649d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m 
Feb 23 04:57:40 localhost nova_compute[282206]: 2026-02-23 09:57:40.536 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:40.536 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5ddbd6da-e2df-4132-a083-423c772717e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:40 localhost ovn_controller[157695]: 2026-02-23T09:57:40Z|00224|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:57:40 localhost nova_compute[282206]: 2026-02-23 09:57:40.589 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:41 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:41.069 2 INFO neutron.agent.securitygroups_rpc [None req-0a45a65f-615e-41b1-9ed5-950f0c0558fe 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']#033[00m Feb 23 04:57:41 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:41.655 2 INFO neutron.agent.securitygroups_rpc [None req-e4560e49-35cf-47a1-b71e-2f4b224de7c3 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']#033[00m Feb 23 04:57:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e131 do_prune osdmap full prune enabled Feb 23 04:57:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e132 e132: 6 total, 6 up, 6 in Feb 23 04:57:42 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] 
: osdmap e132: 6 total, 6 up, 6 in Feb 23 04:57:42 localhost dnsmasq[314412]: exiting on receipt of SIGTERM Feb 23 04:57:42 localhost podman[314642]: 2026-02-23 09:57:42.077034632 +0000 UTC m=+0.052614537 container kill 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:57:42 localhost systemd[1]: libpod-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424.scope: Deactivated successfully. Feb 23 04:57:42 localhost podman[314654]: 2026-02-23 09:57:42.123511667 +0000 UTC m=+0.034735903 container died 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:57:42 localhost systemd[1]: tmp-crun.PZaKFQ.mount: Deactivated successfully. 
Feb 23 04:57:42 localhost podman[314654]: 2026-02-23 09:57:42.149518601 +0000 UTC m=+0.060742827 container cleanup 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216) Feb 23 04:57:42 localhost systemd[1]: libpod-conmon-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424.scope: Deactivated successfully. Feb 23 04:57:42 localhost podman[314656]: 2026-02-23 09:57:42.238856571 +0000 UTC m=+0.142361960 container remove 94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bc69bb6e-6101-4390-bb1b-15e16ba6649d, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:57:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:42.337 2 INFO neutron.agent.securitygroups_rpc [None req-602c32a6-283b-418b-b086-4a732e23eda9 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']#033[00m Feb 23 04:57:42 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:42.613 265541 INFO neutron.agent.dhcp.agent [None req-57e28c15-8e1b-4ab2-9b60-28485b2c37cb - - - - - -] Network not 
present, action: clean_devices, action_kwargs: {}
Feb 23 04:57:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:42.776 2 INFO neutron.agent.securitygroups_rpc [None req-dd2fa8de-dc79-4abe-8c24-c96dc8537d15 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 04:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 04:57:42 localhost podman[314680]: 2026-02-23 09:57:42.878160553 +0000 UTC m=+0.079072435 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 23 04:57:42 localhost podman[314680]: 2026-02-23 09:57:42.893366533 +0000 UTC m=+0.094278405 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 23 04:57:42 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 04:57:43 localhost nova_compute[282206]: 2026-02-23 09:57:43.058 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:43 localhost nova_compute[282206]: 2026-02-23 09:57:43.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:43 localhost systemd[1]: var-lib-containers-storage-overlay-06e3421c55fa244bcd15bc7912f7bbd56b819c3903bda310ddc27d1afd8ce870-merged.mount: Deactivated successfully.
Feb 23 04:57:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-94fddd27d989ddd5b7ff6b83a6707cecd5d35645999cbfb25441cd52ed72f424-userdata-shm.mount: Deactivated successfully.
Feb 23 04:57:43 localhost systemd[1]: run-netns-qdhcp\x2dbc69bb6e\x2d6101\x2d4390\x2dbb1b\x2d15e16ba6649d.mount: Deactivated successfully.
Feb 23 04:57:43 localhost openstack_network_exporter[245358]: ERROR 09:57:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 04:57:43 localhost openstack_network_exporter[245358]:
Feb 23 04:57:43 localhost openstack_network_exporter[245358]: ERROR 09:57:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 04:57:43 localhost openstack_network_exporter[245358]:
Feb 23 04:57:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:43.639 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 04:57:43 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:43.672 2 INFO neutron.agent.securitygroups_rpc [None req-5c824a03-c3ca-4f5e-acd4-18b4108691ee 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 04:57:44 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:44.203 2 INFO neutron.agent.securitygroups_rpc [None req-9496869f-6aca-4c6c-85b6-7d3756f27d34 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['677a40c3-5537-439d-a2b8-f8dc0d2877b8']
Feb 23 04:57:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e132 do_prune osdmap full prune enabled
Feb 23 04:57:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e133 e133: 6 total, 6 up, 6 in
Feb 23 04:57:44 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in
Feb 23 04:57:44 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:44.983 2 INFO neutron.agent.securitygroups_rpc [None req-18778c88-b1b9-4b6b-be87-89332e5bf54d 51fa3c9fb33d4d329f38b19fc5b6099c ed50c6ac234342909e3b98d8ad22ae1a - - default default] Security group rule updated ['15676482-e837-4bed-9cab-0aada6b790b9']
Feb 23 04:57:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e133 do_prune osdmap full prune enabled
Feb 23 04:57:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e134 e134: 6 total, 6 up, 6 in
Feb 23 04:57:45 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in
Feb 23 04:57:45 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:45.718 2 INFO neutron.agent.securitygroups_rpc [None req-d9005b4c-4b89-4219-884c-0c4473a3114a 0bcd1a517bf5477491d448b5d8ebf7eb ef475d924469485f883dd5a9d719a22d - - default default] Security group rule updated ['3ef9048d-1c37-421d-bb50-73975b08bdfd']
Feb 23 04:57:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 04:57:45 localhost podman[314699]: 2026-02-23 09:57:45.905262026 +0000 UTC m=+0.087154023 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 23 04:57:45 localhost podman[314699]: 2026-02-23 09:57:45.935176761 +0000 UTC m=+0.117068748 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:57:45 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 04:57:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:57:47 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:47.122 2 INFO neutron.agent.securitygroups_rpc [None req-16ac4a4c-9b36-487b-bcc5-bd14fe4ab634 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 04:57:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e134 do_prune osdmap full prune enabled
Feb 23 04:57:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e135 e135: 6 total, 6 up, 6 in
Feb 23 04:57:47 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in
Feb 23 04:57:47 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:47.529 2 INFO neutron.agent.securitygroups_rpc [None req-28134d87-6144-4ae7-b9ec-7045d33170e4 4e19ac6dec8e40fbad0c3f681ec14665 6aadd525d3dd402cb701922115d00291 - - default default] Security group member updated ['a015e445-a8f1-4c73-9375-43b03b806b24']
Feb 23 04:57:47 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:47.590 265541 INFO neutron.agent.linux.ip_lib [None req-50dfad2c-221c-4076-8f93-d22270139bf1 - - - - - -] Device tap9fe0c543-bd cannot be used as it has no MAC address
Feb 23 04:57:47 localhost nova_compute[282206]: 2026-02-23 09:57:47.617 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:47 localhost kernel: device tap9fe0c543-bd entered promiscuous mode
Feb 23 04:57:47 localhost ovn_controller[157695]: 2026-02-23T09:57:47Z|00225|binding|INFO|Claiming lport 9fe0c543-bd4c-4645-bfe3-41c26546041f for this chassis.
Feb 23 04:57:47 localhost ovn_controller[157695]: 2026-02-23T09:57:47Z|00226|binding|INFO|9fe0c543-bd4c-4645-bfe3-41c26546041f: Claiming unknown
Feb 23 04:57:47 localhost nova_compute[282206]: 2026-02-23 09:57:47.627 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:47 localhost NetworkManager[5974]: [1771840667.6297] manager: (tap9fe0c543-bd): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Feb 23 04:57:47 localhost systemd-udevd[314727]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 04:57:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:47.635 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d02ccbf-9c2f-41fe-937e-ed56e04b90fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9fe0c543-bd4c-4645-bfe3-41c26546041f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 04:57:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:47.637 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 9fe0c543-bd4c-4645-bfe3-41c26546041f in datapath 003df3c3-8c3a-4476-9564-6a5246acd7a3 bound to our chassis
Feb 23 04:57:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:47.639 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5306cfa7-96e5-4d20-b7cc-0e879b95e6f5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 23 04:57:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:47.640 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 003df3c3-8c3a-4476-9564-6a5246acd7a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 04:57:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:47.640 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c0aeb981-7e96-42a2-a695-36ea1c9215d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost ovn_controller[157695]: 2026-02-23T09:57:47Z|00227|binding|INFO|Setting lport 9fe0c543-bd4c-4645-bfe3-41c26546041f ovn-installed in OVS
Feb 23 04:57:47 localhost ovn_controller[157695]: 2026-02-23T09:57:47Z|00228|binding|INFO|Setting lport 9fe0c543-bd4c-4645-bfe3-41c26546041f up in Southbound
Feb 23 04:57:47 localhost nova_compute[282206]: 2026-02-23 09:57:47.663 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost journal[231253]: ethtool ioctl error on tap9fe0c543-bd: No such device
Feb 23 04:57:47 localhost nova_compute[282206]: 2026-02-23 09:57:47.714 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:47 localhost nova_compute[282206]: 2026-02-23 09:57:47.741 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost nova_compute[282206]: 2026-02-23 09:57:48.062 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:48.167 265541 INFO neutron.agent.linux.ip_lib [None req-fdd09cea-58fb-4227-ad7f-785b75fd49b9 - - - - - -] Device tap908116f5-e2 cannot be used as it has no MAC address
Feb 23 04:57:48 localhost nova_compute[282206]: 2026-02-23 09:57:48.191 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost kernel: device tap908116f5-e2 entered promiscuous mode
Feb 23 04:57:48 localhost NetworkManager[5974]: [1771840668.1972] manager: (tap908116f5-e2): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Feb 23 04:57:48 localhost ovn_controller[157695]: 2026-02-23T09:57:48Z|00229|binding|INFO|Claiming lport 908116f5-e230-40e0-818e-5844b37f3a2c for this chassis.
Feb 23 04:57:48 localhost ovn_controller[157695]: 2026-02-23T09:57:48Z|00230|binding|INFO|908116f5-e230-40e0-818e-5844b37f3a2c: Claiming unknown
Feb 23 04:57:48 localhost nova_compute[282206]: 2026-02-23 09:57:48.199 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.212 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8df8eb84-f846-4c21-a0a4-93d19730bc64, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=908116f5-e230-40e0-818e-5844b37f3a2c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.214 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 908116f5-e230-40e0-818e-5844b37f3a2c in datapath c843867f-296c-42fa-9f8c-55712f0f7c56 bound to our chassis
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.215 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c843867f-296c-42fa-9f8c-55712f0f7c56 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.216 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b29065-bb4e-47cb-aaf2-0ed8f7fde857]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 04:57:48 localhost ovn_controller[157695]: 2026-02-23T09:57:48Z|00231|binding|INFO|Setting lport 908116f5-e230-40e0-818e-5844b37f3a2c ovn-installed in OVS
Feb 23 04:57:48 localhost ovn_controller[157695]: 2026-02-23T09:57:48Z|00232|binding|INFO|Setting lport 908116f5-e230-40e0-818e-5844b37f3a2c up in Southbound
Feb 23 04:57:48 localhost nova_compute[282206]: 2026-02-23 09:57:48.236 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost systemd-journald[47710]: Data hash table of /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal has a fill level at 75.0 (53724 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 23 04:57:48 localhost systemd-journald[47710]: /run/log/journal/c0212a8b024a111cfc61293864f36c87/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 23 04:57:48 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 04:57:48 localhost nova_compute[282206]: 2026-02-23 09:57:48.286 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:48.300 2 INFO neutron.agent.securitygroups_rpc [None req-16297440-fc3c-47fa-be29-1571a923d41c b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 04:57:48 localhost nova_compute[282206]: 2026-02-23 09:57:48.327 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:48 localhost rsyslogd[758]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 04:57:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:48.559 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 04:57:48 localhost podman[314833]:
Feb 23 04:57:48 localhost podman[314833]: 2026-02-23 09:57:48.812460565 +0000 UTC m=+0.095519642 container create bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 04:57:48 localhost systemd[1]: Started libpod-conmon-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79.scope.
Feb 23 04:57:48 localhost podman[314833]: 2026-02-23 09:57:48.767850007 +0000 UTC m=+0.050909124 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 04:57:48 localhost systemd[1]: tmp-crun.sxckDD.mount: Deactivated successfully.
Feb 23 04:57:48 localhost systemd[1]: Started libcrun container.
Feb 23 04:57:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef8e07bef3889686b85bfa5eac244cc48bec7dc7da70c4883c2434cba185c80b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:57:48 localhost podman[314833]: 2026-02-23 09:57:48.90811565 +0000 UTC m=+0.191174717 container init bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216)
Feb 23 04:57:48 localhost podman[314833]: 2026-02-23 09:57:48.917257713 +0000 UTC m=+0.200316780 container start bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team)
Feb 23 04:57:48 localhost dnsmasq[314857]: started, version 2.85 cachesize 150
Feb 23 04:57:48 localhost dnsmasq[314857]: DNS service limited to local subnets
Feb 23 04:57:48 localhost dnsmasq[314857]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 04:57:48 localhost dnsmasq[314857]: warning: no upstream servers configured
Feb 23 04:57:48 localhost dnsmasq-dhcp[314857]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 04:57:48 localhost dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 0 addresses
Feb 23 04:57:48 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 04:57:48 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 04:57:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:48.973 265541 INFO neutron.agent.dhcp.agent [None req-9f7599b3-6ad5-4943-8708-f6fdebdfcb2c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:47Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=67870819-392f-4850-8a0e-4545b8c9e49b, ip_allocation=immediate, mac_address=fa:16:3e:6d:29:d1, name=tempest-RoutersTest-9259871, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:44Z, description=, dns_domain=, id=003df3c3-8c3a-4476-9564-6a5246acd7a3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1090077587, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11935, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1789, status=ACTIVE, subnets=['21a37bfd-fe0e-4035-abac-601c14169356'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:45Z, vlan_transparent=None, network_id=003df3c3-8c3a-4476-9564-6a5246acd7a3, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a015e445-a8f1-4c73-9375-43b03b806b24'], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:47Z on network 003df3c3-8c3a-4476-9564-6a5246acd7a3
Feb 23 04:57:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e135 do_prune osdmap full prune enabled
Feb 23 04:57:49 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:49.063 265541 INFO neutron.agent.dhcp.agent [None req-a52d8f5a-876f-4f5b-96e0-52e204ec051f - - - - - -] DHCP configuration for ports {'2d82a807-e8c0-4bc7-82c9-ebbf52a48105'} is completed
Feb 23 04:57:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e136 e136: 6 total, 6 up, 6 in
Feb 23 04:57:49 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in
Feb 23 04:57:49 localhost dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 1 addresses
Feb 23 04:57:49 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 04:57:49 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 04:57:49 localhost podman[314907]: 2026-02-23 09:57:49.234198315 +0000 UTC m=+0.050487191 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 04:57:49 localhost podman[314895]:
Feb 23 04:57:49 localhost podman[314895]: 2026-02-23 09:57:49.255286597 +0000 UTC m=+0.130906456 container create 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 23 04:57:49 localhost systemd[1]: Started libpod-conmon-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1.scope.
Feb 23 04:57:49 localhost systemd[1]: Started libcrun container.
Feb 23 04:57:49 localhost podman[314895]: 2026-02-23 09:57:49.211500053 +0000 UTC m=+0.087119952 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 04:57:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72bec57338a9b4c2dc713f40ed6af99e9442fa24f4fd216de691c92203a3e570/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 04:57:49 localhost podman[314895]: 2026-02-23 09:57:49.319453509 +0000 UTC m=+0.195073338 container init 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 04:57:49 localhost podman[314895]: 2026-02-23 09:57:49.327834908 +0000 UTC m=+0.203454757 container start 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216)
Feb 23 04:57:49 localhost dnsmasq[314933]: started, version 2.85 cachesize 150
Feb 23 04:57:49 localhost dnsmasq[314933]: DNS service limited to local subnets
Feb 23 04:57:49 localhost dnsmasq[314933]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 04:57:49 localhost dnsmasq[314933]: warning: no upstream servers configured
Feb 23 04:57:49 localhost dnsmasq-dhcp[314933]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 04:57:49 localhost dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 0 addresses
Feb 23 04:57:49 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host
Feb 23 04:57:49 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts
Feb 23 04:57:49 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:49.545 265541 INFO neutron.agent.dhcp.agent [None req-78f02289-4e63-43d9-9067-891e262dc7a4 - - - - - -] DHCP configuration for ports {'67870819-392f-4850-8a0e-4545b8c9e49b', '8f12aa06-ee55-4f04-a11b-abfcbb418947'} is completed
Feb 23 04:57:49 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:49.584 2 INFO neutron.agent.securitygroups_rpc [None req-35663660-8b26-425d-b466-39f75aa64f72 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 04:57:50 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:50.371 2 INFO neutron.agent.securitygroups_rpc [None req-2cc4babb-57fc-4ffe-921b-a7d431fda5c5 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 04:57:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:51.096 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:57:47Z, description=, device_id=e9c5e1e3-ec18-4894-90a9-76e50883b72e, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=67870819-392f-4850-8a0e-4545b8c9e49b, ip_allocation=immediate, mac_address=fa:16:3e:6d:29:d1, name=tempest-RoutersTest-9259871, network_id=003df3c3-8c3a-4476-9564-6a5246acd7a3, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['a015e445-a8f1-4c73-9375-43b03b806b24'], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:57:48Z on network 003df3c3-8c3a-4476-9564-6a5246acd7a3
Feb 23 04:57:51 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:51.259 2 INFO neutron.agent.securitygroups_rpc [None req-a96e8b20-0ccf-41bb-a529-34f8cbad2842 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']
Feb 23 04:57:51 localhost podman[314954]: 2026-02-23 09:57:51.318727256 +0000 UTC m=+0.060599682 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 04:57:51 localhost dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 1 addresses
Feb 23 04:57:51 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host
Feb 23 04:57:51 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts
Feb 23 04:57:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:51.534 265541 INFO neutron.agent.dhcp.agent [None req-c91303f1-a104-464c-b45a-58b958b4263a - - - - - -] DHCP configuration for ports {'67870819-392f-4850-8a0e-4545b8c9e49b'} is completed
Feb 23 04:57:51 localhost nova_compute[282206]: 2026-02-23 09:57:51.954 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 04:57:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 23 04:57:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e136 do_prune osdmap full prune enabled
Feb 23 04:57:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e137 e137: 6 total, 6 up, 6 in
Feb 23 04:57:52 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in
Feb 23 04:57:52 localhost
neutron_dhcp_agent[265537]: 2026-02-23 09:57:52.561 265541 INFO neutron.agent.linux.ip_lib [None req-2e7c7154-d0f6-4ae0-9881-fc135e57cc03 - - - - - -] Device tap471e704b-d2 cannot be used as it has no MAC address#033[00m Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.584 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost kernel: device tap471e704b-d2 entered promiscuous mode Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.590 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost NetworkManager[5974]: [1771840672.5922] manager: (tap471e704b-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/40) Feb 23 04:57:52 localhost systemd-udevd[314985]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:57:52 localhost ovn_controller[157695]: 2026-02-23T09:57:52Z|00233|binding|INFO|Claiming lport 471e704b-d2af-4ed9-bee8-b3da9da96eee for this chassis. 
Feb 23 04:57:52 localhost ovn_controller[157695]: 2026-02-23T09:57:52Z|00234|binding|INFO|471e704b-d2af-4ed9-bee8-b3da9da96eee: Claiming unknown Feb 23 04:57:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:52.606 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d673836f-3caa-47cd-a8d5-bae35d62c0ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=471e704b-d2af-4ed9-bee8-b3da9da96eee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:52.612 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 471e704b-d2af-4ed9-bee8-b3da9da96eee in datapath 53593c6a-4c2a-420a-9472-e7be0052fa39 bound to our chassis#033[00m Feb 23 04:57:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:52.614 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
53593c6a-4c2a-420a-9472-e7be0052fa39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:57:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:52.614 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[4f187005-63d5-42f2-b16d-539ef5d8f6ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.625 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost ovn_controller[157695]: 2026-02-23T09:57:52Z|00235|binding|INFO|Setting lport 471e704b-d2af-4ed9-bee8-b3da9da96eee ovn-installed in OVS Feb 23 04:57:52 localhost ovn_controller[157695]: 2026-02-23T09:57:52Z|00236|binding|INFO|Setting lport 471e704b-d2af-4ed9-bee8-b3da9da96eee up in Southbound Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.630 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.633 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on 
tap471e704b-d2: No such device Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost journal[231253]: ethtool ioctl error on tap471e704b-d2: No such device Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.672 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost nova_compute[282206]: 2026-02-23 09:57:52.700 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:52 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:52.710 2 INFO neutron.agent.securitygroups_rpc [None req-6e03d4f3-0be2-400e-8da5-d3c25ea65d96 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:57:52 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:52.865 2 INFO neutron.agent.securitygroups_rpc [None req-36c16d45-719d-4f5a-b525-80c8ec71acee 4e19ac6dec8e40fbad0c3f681ec14665 6aadd525d3dd402cb701922115d00291 - - default default] Security group member updated ['a015e445-a8f1-4c73-9375-43b03b806b24']#033[00m Feb 23 04:57:53 localhost nova_compute[282206]: 2026-02-23 09:57:53.077 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:53 localhost systemd[1]: tmp-crun.YfzErZ.mount: Deactivated successfully. 
Feb 23 04:57:53 localhost dnsmasq[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/addn_hosts - 0 addresses Feb 23 04:57:53 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/host Feb 23 04:57:53 localhost dnsmasq-dhcp[314857]: read /var/lib/neutron/dhcp/003df3c3-8c3a-4476-9564-6a5246acd7a3/opts Feb 23 04:57:53 localhost podman[315046]: 2026-02-23 09:57:53.130182573 +0000 UTC m=+0.074106470 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:57:53 localhost ovn_controller[157695]: 2026-02-23T09:57:53Z|00237|binding|INFO|Releasing lport 9fe0c543-bd4c-4645-bfe3-41c26546041f from this chassis (sb_readonly=0) Feb 23 04:57:53 localhost ovn_controller[157695]: 2026-02-23T09:57:53Z|00238|binding|INFO|Setting lport 9fe0c543-bd4c-4645-bfe3-41c26546041f down in Southbound Feb 23 04:57:53 localhost nova_compute[282206]: 2026-02-23 09:57:53.299 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:53 localhost kernel: device tap9fe0c543-bd left promiscuous mode Feb 23 04:57:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:53.309 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], 
virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-003df3c3-8c3a-4476-9564-6a5246acd7a3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7d02ccbf-9c2f-41fe-937e-ed56e04b90fb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9fe0c543-bd4c-4645-bfe3-41c26546041f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:57:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:53.311 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 9fe0c543-bd4c-4645-bfe3-41c26546041f in datapath 003df3c3-8c3a-4476-9564-6a5246acd7a3 unbound from our chassis#033[00m Feb 23 04:57:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:53.315 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 003df3c3-8c3a-4476-9564-6a5246acd7a3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:57:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:57:53.316 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef40639-3f1f-4798-8932-e713d58f1867]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:57:53 localhost nova_compute[282206]: 2026-02-23 09:57:53.318 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:53 localhost podman[315097]: Feb 23 04:57:53 localhost podman[315097]: 2026-02-23 09:57:53.526461236 +0000 UTC m=+0.093695956 container create 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:53 localhost systemd[1]: Started libpod-conmon-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf.scope. Feb 23 04:57:53 localhost systemd[1]: Started libcrun container. 
Feb 23 04:57:53 localhost podman[315097]: 2026-02-23 09:57:53.480425344 +0000 UTC m=+0.047660094 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:57:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5440e0fd82ce5dacc9263097b3678f8d36f3eeeec3a08ac5ea46b12225532482/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:57:53 localhost podman[315097]: 2026-02-23 09:57:53.591142355 +0000 UTC m=+0.158377055 container init 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Feb 23 04:57:53 localhost podman[315097]: 2026-02-23 09:57:53.599508043 +0000 UTC m=+0.166742743 container start 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:57:53 localhost dnsmasq[315116]: started, version 2.85 cachesize 150 Feb 23 04:57:53 localhost dnsmasq[315116]: DNS service limited to local subnets Feb 23 04:57:53 localhost dnsmasq[315116]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:57:53 localhost dnsmasq[315116]: warning: no upstream servers configured Feb 23 04:57:53 localhost dnsmasq-dhcp[315116]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 04:57:53 localhost dnsmasq[315116]: read /var/lib/neutron/dhcp/53593c6a-4c2a-420a-9472-e7be0052fa39/addn_hosts - 0 addresses Feb 23 04:57:53 localhost dnsmasq-dhcp[315116]: read /var/lib/neutron/dhcp/53593c6a-4c2a-420a-9472-e7be0052fa39/host Feb 23 04:57:53 localhost dnsmasq-dhcp[315116]: read /var/lib/neutron/dhcp/53593c6a-4c2a-420a-9472-e7be0052fa39/opts Feb 23 04:57:54 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:54.028 265541 INFO neutron.agent.dhcp.agent [None req-89831f41-dbfb-4976-8fd0-345c397ca2d8 - - - - - -] DHCP configuration for ports {'f831d45b-9e7e-481e-9011-7f9545ba3cff'} is completed#033[00m Feb 23 04:57:55 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:55.154 2 INFO neutron.agent.securitygroups_rpc [None req-518b7324-91f0-437a-87bd-ce34c1a3be1e e22ee96829d64023b04af5ccfdd0ab53 a3622447b13c4164ad418e851634e3b3 - - default default] Security group member updated ['b06f7d0b-a9fc-4c26-994a-bc68e12c2cf6']#033[00m Feb 23 04:57:55 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:55.219 2 INFO neutron.agent.securitygroups_rpc [None req-1ba87428-a084-4aaf-9c26-da4479389d22 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:57:55 localhost nova_compute[282206]: 2026-02-23 09:57:55.449 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:57:55 localhost podman[315135]: 2026-02-23 09:57:55.873007644 +0000 UTC m=+0.060645904 container kill bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 23 04:57:55 localhost dnsmasq[314857]: exiting on receipt of SIGTERM Feb 23 04:57:55 localhost systemd[1]: libpod-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79.scope: Deactivated successfully. Feb 23 04:57:55 localhost systemd[1]: tmp-crun.4AS6Cl.mount: Deactivated successfully. Feb 23 04:57:55 localhost podman[315141]: 2026-02-23 09:57:55.919976306 +0000 UTC m=+0.093197790 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:57:55 localhost podman[315158]: 2026-02-23 09:57:55.95118514 +0000 UTC m=+0.061335876 container died bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:57:55 localhost podman[315158]: 2026-02-23 09:57:55.990845235 +0000 UTC m=+0.100995931 container cleanup bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:57:55 localhost systemd[1]: libpod-conmon-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79.scope: Deactivated successfully. 
Feb 23 04:57:56 localhost podman[315160]: 2026-02-23 09:57:56.009708338 +0000 UTC m=+0.112655771 container remove bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-003df3c3-8c3a-4476-9564-6a5246acd7a3, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 04:57:56 localhost podman[315141]: 2026-02-23 09:57:56.022683489 +0000 UTC m=+0.195904933 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:57:56 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:57:56 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:56.068 265541 INFO neutron.agent.dhcp.agent [None req-2dfe25f0-f3d6-4e3a-b631-7966a74d8b02 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:56 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:57:56.783 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:57:56 localhost systemd[1]: tmp-crun.M72uVd.mount: Deactivated successfully. Feb 23 04:57:56 localhost systemd[1]: var-lib-containers-storage-overlay-ef8e07bef3889686b85bfa5eac244cc48bec7dc7da70c4883c2434cba185c80b-merged.mount: Deactivated successfully. Feb 23 04:57:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bfce5c19dc23ffedf53633ad7904ac4bc4ec469f4af130c1a3d18600db50fa79-userdata-shm.mount: Deactivated successfully. Feb 23 04:57:56 localhost systemd[1]: run-netns-qdhcp\x2d003df3c3\x2d8c3a\x2d4476\x2d9564\x2d6a5246acd7a3.mount: Deactivated successfully. 
Feb 23 04:57:57 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:57.022 2 INFO neutron.agent.securitygroups_rpc [None req-758250a6-9620-43e4-8e85-56ce0e87f261 b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:57:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:57:57 localhost ovn_controller[157695]: 2026-02-23T09:57:57Z|00239|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:57:57 localhost nova_compute[282206]: 2026-02-23 09:57:57.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 04:57:57 localhost podman[315194]: 2026-02-23 09:57:57.905863551 +0000 UTC m=+0.079503118 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64) Feb 23 04:57:57 localhost podman[315194]: 2026-02-23 09:57:57.919479952 +0000 UTC m=+0.093119479 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 
'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 04:57:57 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:57:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e137 do_prune osdmap full prune enabled Feb 23 04:57:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e138 e138: 6 total, 6 up, 6 in Feb 23 04:57:58 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in Feb 23 04:57:58 localhost nova_compute[282206]: 2026-02-23 09:57:58.113 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:57:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:58.662 2 INFO neutron.agent.securitygroups_rpc [None req-90d7272c-2a16-4c66-834f-ae9e72b06d5d b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:57:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:57:58.724 2 INFO neutron.agent.securitygroups_rpc [None req-92d71163-849e-49a8-9e78-04255fc35661 e22ee96829d64023b04af5ccfdd0ab53 a3622447b13c4164ad418e851634e3b3 - - default default] Security group member updated ['b06f7d0b-a9fc-4c26-994a-bc68e12c2cf6']#033[00m Feb 23 04:58:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:58:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3705593809' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:58:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:58:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3705593809' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:58:01 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:01.242 2 INFO neutron.agent.securitygroups_rpc [None req-28ec64b1-e1b5-4dd6-940d-987b4dc3aa1e b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:58:01 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:01.946 2 INFO neutron.agent.securitygroups_rpc [None req-1398d805-3ae6-40de-8906-b5cfcdc73dab b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:58:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e138 do_prune osdmap full prune enabled Feb 23 04:58:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e139 e139: 6 total, 6 up, 6 in Feb 23 04:58:02 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in Feb 23 04:58:02 localhost ovn_controller[157695]: 2026-02-23T09:58:02Z|00240|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.169 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:02.199 265541 INFO neutron.agent.linux.ip_lib [None req-a9c48374-050a-46cf-8cb2-df1ce0cb8828 - - - - - -] Device tap756bf056-72 cannot be used as it has no MAC address#033[00m Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.220 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost kernel: device tap756bf056-72 entered promiscuous mode Feb 23 04:58:02 localhost NetworkManager[5974]: [1771840682.2283] manager: (tap756bf056-72): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Feb 23 04:58:02 localhost ovn_controller[157695]: 2026-02-23T09:58:02Z|00241|binding|INFO|Claiming lport 756bf056-7233-47b6-8872-465c3e350d26 for this chassis. Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.229 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost ovn_controller[157695]: 2026-02-23T09:58:02Z|00242|binding|INFO|756bf056-7233-47b6-8872-465c3e350d26: Claiming unknown Feb 23 04:58:02 localhost systemd-udevd[315225]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:58:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:02.247 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43874ae-5b5b-4695-9210-e284b1e04551, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=756bf056-7233-47b6-8872-465c3e350d26) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:02.249 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 756bf056-7233-47b6-8872-465c3e350d26 in datapath 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd bound to our chassis#033[00m Feb 23 04:58:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:02.251 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:02.252 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f6cff029-ce32-4a9a-91e8-28550280e4a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.263 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost ovn_controller[157695]: 2026-02-23T09:58:02Z|00243|binding|INFO|Setting lport 756bf056-7233-47b6-8872-465c3e350d26 ovn-installed in OVS Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost ovn_controller[157695]: 2026-02-23T09:58:02Z|00244|binding|INFO|Setting lport 756bf056-7233-47b6-8872-465c3e350d26 up in Southbound Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost journal[231253]: ethtool ioctl error on tap756bf056-72: No such device Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.302 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost nova_compute[282206]: 2026-02-23 09:58:02.327 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:02 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:02.794 2 INFO neutron.agent.securitygroups_rpc [None req-7618cb81-6bfd-47c7-b5cf-c5e505798fed b34a74d70b104d598259f3881ae86305 8635084f010e445d861ab634b753fa27 - - default default] Security group member updated ['80d4c661-e254-4356-81d1-bc4c19a37e6b']#033[00m Feb 23 04:58:03 localhost nova_compute[282206]: 2026-02-23 09:58:03.116 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:03 localhost podman[315296]: Feb 23 04:58:03 localhost podman[315296]: 2026-02-23 09:58:03.133269796 +0000 UTC m=+0.100605380 container create 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Feb 23 04:58:03 localhost podman[315296]: 2026-02-23 09:58:03.084039564 +0000 UTC m=+0.051375178 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:58:03 localhost systemd[1]: Started libpod-conmon-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1.scope. Feb 23 04:58:03 localhost systemd[1]: tmp-crun.KE978s.mount: Deactivated successfully. 
Feb 23 04:58:03 localhost systemd[1]: Started libcrun container. Feb 23 04:58:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52c67337069f1c964652cb1231caff2024e0c511d433352b5a89e8eb28d84b9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:58:03 localhost podman[315296]: 2026-02-23 09:58:03.22791591 +0000 UTC m=+0.195251494 container init 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:03 localhost podman[315296]: 2026-02-23 09:58:03.240578131 +0000 UTC m=+0.207913715 container start 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:03 localhost dnsmasq[315315]: started, version 2.85 cachesize 150 Feb 23 04:58:03 localhost dnsmasq[315315]: DNS service limited to local subnets Feb 23 04:58:03 localhost dnsmasq[315315]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:58:03 
localhost dnsmasq[315315]: warning: no upstream servers configured Feb 23 04:58:03 localhost dnsmasq-dhcp[315315]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Feb 23 04:58:03 localhost dnsmasq[315315]: read /var/lib/neutron/dhcp/74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd/addn_hosts - 0 addresses Feb 23 04:58:03 localhost dnsmasq-dhcp[315315]: read /var/lib/neutron/dhcp/74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd/host Feb 23 04:58:03 localhost dnsmasq-dhcp[315315]: read /var/lib/neutron/dhcp/74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd/opts Feb 23 04:58:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:03.348 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:02Z, description=, device_id=719d7925-a22d-417d-9a79-c8610838b916, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3, ip_allocation=immediate, mac_address=fa:16:3e:75:1a:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:45Z, description=, dns_domain=, id=c843867f-296c-42fa-9f8c-55712f0f7c56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1442624154, port_security_enabled=True, project_id=acbc03f0564045b8857e1689cfa4a66d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35057, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1819, status=ACTIVE, subnets=['394dad53-0de3-4125-ba86-854949b31577'], tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:57:46Z, vlan_transparent=None, network_id=c843867f-296c-42fa-9f8c-55712f0f7c56, 
port_security_enabled=False, project_id=acbc03f0564045b8857e1689cfa4a66d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1940, status=DOWN, tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:58:02Z on network c843867f-296c-42fa-9f8c-55712f0f7c56#033[00m Feb 23 04:58:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:03.427 265541 INFO neutron.agent.dhcp.agent [None req-0652b052-b4c2-454d-a519-9d07e54daee5 - - - - - -] DHCP configuration for ports {'b99e0923-3e0d-4d50-bd39-8a10af9a0874'} is completed#033[00m Feb 23 04:58:03 localhost dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 1 addresses Feb 23 04:58:03 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host Feb 23 04:58:03 localhost podman[315334]: 2026-02-23 09:58:03.538762894 +0000 UTC m=+0.057513228 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:58:03 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts Feb 23 04:58:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:03.773 265541 INFO neutron.agent.dhcp.agent [None req-83a05713-f416-409a-b6dd-4ca92f3b3057 - - - - - -] DHCP configuration for ports {'1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3'} is completed#033[00m Feb 23 04:58:04 localhost sshd[315354]: main: sshd: ssh-rsa algorithm is 
disabled Feb 23 04:58:04 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:04.885 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:02Z, description=, device_id=719d7925-a22d-417d-9a79-c8610838b916, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3, ip_allocation=immediate, mac_address=fa:16:3e:75:1a:d0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:57:45Z, description=, dns_domain=, id=c843867f-296c-42fa-9f8c-55712f0f7c56, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1442624154, port_security_enabled=True, project_id=acbc03f0564045b8857e1689cfa4a66d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35057, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1819, status=ACTIVE, subnets=['394dad53-0de3-4125-ba86-854949b31577'], tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:57:46Z, vlan_transparent=None, network_id=c843867f-296c-42fa-9f8c-55712f0f7c56, port_security_enabled=False, project_id=acbc03f0564045b8857e1689cfa4a66d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1940, status=DOWN, tags=[], tenant_id=acbc03f0564045b8857e1689cfa4a66d, updated_at=2026-02-23T09:58:02Z on network c843867f-296c-42fa-9f8c-55712f0f7c56#033[00m Feb 23 04:58:05 localhost podman[315374]: 2026-02-23 09:58:05.084775128 +0000 UTC m=+0.058216189 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:58:05 localhost systemd[1]: tmp-crun.YXd9kp.mount: Deactivated successfully. Feb 23 04:58:05 localhost dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 1 addresses Feb 23 04:58:05 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host Feb 23 04:58:05 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts Feb 23 04:58:05 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:05.336 265541 INFO neutron.agent.dhcp.agent [None req-26abc0b7-b08d-45f7-9c29-51dcd8441aa9 - - - - - -] DHCP configuration for ports {'1d3dbdf1-1d97-4ebd-a95e-1fc3ea0ad5d3'} is completed#033[00m Feb 23 04:58:06 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:06.389 2 INFO neutron.agent.securitygroups_rpc [None req-f1c412d9-695e-47a0-9d6e-ba30fd3bc526 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:07 localhost dnsmasq[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/addn_hosts - 0 addresses Feb 23 04:58:07 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/host Feb 23 04:58:07 localhost 
podman[315411]: 2026-02-23 09:58:07.142255406 +0000 UTC m=+0.058726686 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:07 localhost dnsmasq-dhcp[314933]: read /var/lib/neutron/dhcp/c843867f-296c-42fa-9f8c-55712f0f7c56/opts Feb 23 04:58:07 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:07.157 2 INFO neutron.agent.securitygroups_rpc [None req-a9451f2f-78ed-41db-9d83-c983c36607eb 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:07 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:07.279 2 INFO neutron.agent.securitygroups_rpc [None req-c82f4f13-a6c6-4dfc-aae6-5892f71ca6d5 8ff2abb777c74a6dbae4721d46f0d17a 182b0ebb06754cfab10ebabcdf7056ed - - default default] Security group member updated ['c029b069-aec5-44a4-9af0-e58cbf64895c']#033[00m Feb 23 04:58:07 localhost nova_compute[282206]: 2026-02-23 09:58:07.296 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:07 localhost kernel: device tap908116f5-e2 left promiscuous mode Feb 23 04:58:07 localhost ovn_controller[157695]: 2026-02-23T09:58:07Z|00245|binding|INFO|Releasing lport 908116f5-e230-40e0-818e-5844b37f3a2c from this chassis (sb_readonly=0) Feb 23 04:58:07 localhost ovn_controller[157695]: 2026-02-23T09:58:07Z|00246|binding|INFO|Setting lport 
908116f5-e230-40e0-818e-5844b37f3a2c down in Southbound Feb 23 04:58:07 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:07.309 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c843867f-296c-42fa-9f8c-55712f0f7c56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8df8eb84-f846-4c21-a0a4-93d19730bc64, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=908116f5-e230-40e0-818e-5844b37f3a2c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:07 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:07.311 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 908116f5-e230-40e0-818e-5844b37f3a2c in datapath c843867f-296c-42fa-9f8c-55712f0f7c56 unbound from our chassis#033[00m Feb 23 04:58:07 localhost nova_compute[282206]: 2026-02-23 09:58:07.311 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:07 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:07.313 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c843867f-296c-42fa-9f8c-55712f0f7c56 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:07 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:07.314 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[130a033a-f4b0-405c-b683-60659a6f8f91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:08 localhost dnsmasq[315315]: exiting on receipt of SIGTERM Feb 23 04:58:08 localhost podman[315448]: 2026-02-23 09:58:08.012083439 +0000 UTC m=+0.060374845 container kill 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:58:08 localhost systemd[1]: libpod-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1.scope: Deactivated successfully. Feb 23 04:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 04:58:08 localhost podman[315461]: 2026-02-23 09:58:08.091904496 +0000 UTC m=+0.064852915 container died 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:58:08 localhost nova_compute[282206]: 2026-02-23 09:58:08.119 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:58:08 localhost podman[315469]: 2026-02-23 09:58:08.154464088 +0000 UTC m=+0.106113369 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:08 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1-userdata-shm.mount: Deactivated successfully. Feb 23 04:58:08 localhost systemd[1]: var-lib-containers-storage-overlay-52c67337069f1c964652cb1231caff2024e0c511d433352b5a89e8eb28d84b9a-merged.mount: Deactivated successfully. 
Feb 23 04:58:08 localhost podman[315469]: 2026-02-23 09:58:08.238102433 +0000 UTC m=+0.189751694 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 04:58:08 localhost nova_compute[282206]: 2026-02-23 09:58:08.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:08.239 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port d5af4bb4-2866-499e-8695-7ba6d053f969 with type ""#033[00m Feb 23 04:58:08 localhost 
ovn_metadata_agent[163567]: 2026-02-23 09:58:08.242 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e43874ae-5b5b-4695-9210-e284b1e04551, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=756bf056-7233-47b6-8872-465c3e350d26) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:08 localhost ovn_controller[157695]: 2026-02-23T09:58:08Z|00247|binding|INFO|Removing iface tap756bf056-72 ovn-installed in OVS Feb 23 04:58:08 localhost ovn_controller[157695]: 2026-02-23T09:58:08Z|00248|binding|INFO|Removing lport 756bf056-7233-47b6-8872-465c3e350d26 ovn-installed in OVS Feb 23 04:58:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:08.246 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 756bf056-7233-47b6-8872-465c3e350d26 in datapath 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd unbound from our chassis#033[00m Feb 23 
04:58:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:08.249 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:08 localhost nova_compute[282206]: 2026-02-23 09:58:08.250 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:08 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:08.250 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8d9e11-606a-4dac-9453-bbe5857f5a69]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:08 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:58:08 localhost podman[315461]: 2026-02-23 09:58:08.293833644 +0000 UTC m=+0.266782023 container cleanup 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:58:08 localhost systemd[1]: libpod-conmon-8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1.scope: Deactivated successfully. 
Feb 23 04:58:08 localhost podman[315463]: 2026-02-23 09:58:08.318556769 +0000 UTC m=+0.282517130 container remove 8dacfa928e6d23d5b0bcb26e6023e866c9cfe6d3eca9d6a0f3b0e7b2de68dfd1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-74bfca51-6a6b-4fa1-bb91-4bfc9a96dacd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:58:08 localhost nova_compute[282206]: 2026-02-23 09:58:08.334 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:08 localhost kernel: device tap756bf056-72 left promiscuous mode Feb 23 04:58:08 localhost podman[315497]: 2026-02-23 09:58:08.243144929 +0000 UTC m=+0.111389793 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:58:08 localhost nova_compute[282206]: 2026-02-23 09:58:08.351 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:08 localhost podman[315497]: 2026-02-23 09:58:08.376389265 +0000 UTC m=+0.244634169 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:58:08 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:08.386 265541 INFO neutron.agent.dhcp.agent [None req-4bd86484-d733-437f-acc1-bf3cc7a70cbc - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:08 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 04:58:08 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:08.660 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:09 localhost ovn_controller[157695]: 2026-02-23T09:58:09Z|00249|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:58:09 localhost nova_compute[282206]: 2026-02-23 09:58:09.138 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:09 localhost systemd[1]: run-netns-qdhcp\x2d74bfca51\x2d6a6b\x2d4fa1\x2dbb91\x2d4bfc9a96dacd.mount: Deactivated successfully. 
Feb 23 04:58:09 localhost podman[242954]: time="2026-02-23T09:58:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:58:09 localhost podman[242954]: @ - - [23/Feb/2026:09:58:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160715 "" "Go-http-client/1.1" Feb 23 04:58:09 localhost podman[242954]: @ - - [23/Feb/2026:09:58:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19762 "" "Go-http-client/1.1" Feb 23 04:58:09 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:09.760 2 INFO neutron.agent.securitygroups_rpc [None req-32f4623b-5152-4fb4-8665-550d3831cd54 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e139 do_prune osdmap full prune enabled Feb 23 04:58:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e140 e140: 6 total, 6 up, 6 in Feb 23 04:58:10 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in Feb 23 04:58:10 localhost podman[315554]: 2026-02-23 09:58:10.299539032 +0000 UTC m=+0.069734816 container kill 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0) Feb 23 04:58:10 localhost dnsmasq[315116]: exiting on receipt of SIGTERM Feb 23 04:58:10 localhost 
systemd[1]: libpod-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf.scope: Deactivated successfully. Feb 23 04:58:10 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:10.351 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:10 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:10.352 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:58:10 localhost nova_compute[282206]: 2026-02-23 09:58:10.384 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:10 localhost podman[315569]: 2026-02-23 09:58:10.390458531 +0000 UTC m=+0.069697654 container died 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0) Feb 23 04:58:10 localhost systemd[1]: 
tmp-crun.390MTa.mount: Deactivated successfully. Feb 23 04:58:10 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf-userdata-shm.mount: Deactivated successfully. Feb 23 04:58:10 localhost podman[315569]: 2026-02-23 09:58:10.435645457 +0000 UTC m=+0.114884550 container remove 372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-53593c6a-4c2a-420a-9472-e7be0052fa39, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 04:58:10 localhost systemd[1]: libpod-conmon-372b1b70d42257631680f0e1c8a922f42cd8a88a6ad7a8af2743c42e78e3eedf.scope: Deactivated successfully. 
Feb 23 04:58:10 localhost nova_compute[282206]: 2026-02-23 09:58:10.452 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:10 localhost kernel: device tap471e704b-d2 left promiscuous mode Feb 23 04:58:10 localhost ovn_controller[157695]: 2026-02-23T09:58:10Z|00250|binding|INFO|Releasing lport 471e704b-d2af-4ed9-bee8-b3da9da96eee from this chassis (sb_readonly=0) Feb 23 04:58:10 localhost ovn_controller[157695]: 2026-02-23T09:58:10Z|00251|binding|INFO|Setting lport 471e704b-d2af-4ed9-bee8-b3da9da96eee down in Southbound Feb 23 04:58:10 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:10.461 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-53593c6a-4c2a-420a-9472-e7be0052fa39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acbc03f0564045b8857e1689cfa4a66d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d673836f-3caa-47cd-a8d5-bae35d62c0ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=471e704b-d2af-4ed9-bee8-b3da9da96eee) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:10 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:10.463 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 471e704b-d2af-4ed9-bee8-b3da9da96eee in datapath 53593c6a-4c2a-420a-9472-e7be0052fa39 unbound from our chassis#033[00m Feb 23 04:58:10 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:10.465 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 53593c6a-4c2a-420a-9472-e7be0052fa39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:10 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:10.465 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d67b0d9a-d0bb-4ae4-8d79-611fd22147b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:10 localhost nova_compute[282206]: 2026-02-23 09:58:10.483 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:10 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:10.681 265541 INFO neutron.agent.dhcp.agent [None req-5b205fac-0dac-4ec3-86e3-6be5a69902db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:10 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:10.984 2 INFO neutron.agent.securitygroups_rpc [None req-19ae9284-14e3-4a20-834d-20dede799690 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:11 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:11.126 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: 
{}#033[00m Feb 23 04:58:11 localhost systemd[1]: var-lib-containers-storage-overlay-5440e0fd82ce5dacc9263097b3678f8d36f3eeeec3a08ac5ea46b12225532482-merged.mount: Deactivated successfully. Feb 23 04:58:11 localhost systemd[1]: run-netns-qdhcp\x2d53593c6a\x2d4c2a\x2d420a\x2d9472\x2de7be0052fa39.mount: Deactivated successfully. Feb 23 04:58:11 localhost ovn_controller[157695]: 2026-02-23T09:58:11Z|00252|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:58:11 localhost nova_compute[282206]: 2026-02-23 09:58:11.359 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:12.024 265541 INFO neutron.agent.linux.ip_lib [None req-acbd2cfd-da9a-435e-8746-88eb3178d90b - - - - - -] Device tap591e24ae-b5 cannot be used as it has no MAC address#033[00m Feb 23 04:58:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:12 localhost nova_compute[282206]: 2026-02-23 09:58:12.095 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:12 localhost kernel: device tap591e24ae-b5 entered promiscuous mode Feb 23 04:58:12 localhost NetworkManager[5974]: [1771840692.1031] manager: (tap591e24ae-b5): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Feb 23 04:58:12 localhost nova_compute[282206]: 2026-02-23 09:58:12.102 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:12 localhost ovn_controller[157695]: 2026-02-23T09:58:12Z|00253|binding|INFO|Claiming lport 591e24ae-b5aa-4de2-bb90-f63788f17656 for this chassis. 
Feb 23 04:58:12 localhost ovn_controller[157695]: 2026-02-23T09:58:12Z|00254|binding|INFO|591e24ae-b5aa-4de2-bb90-f63788f17656: Claiming unknown Feb 23 04:58:12 localhost systemd-udevd[315604]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:58:12 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:12.121 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '983d362fe1064ddd8f80d65a731f1168', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f1210f-d7d4-4dc7-9c80-9ac59d666074, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=591e24ae-b5aa-4de2-bb90-f63788f17656) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:12 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:12.123 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 591e24ae-b5aa-4de2-bb90-f63788f17656 in datapath 59b1761b-6159-4589-b0ab-d24692c6be4a bound to our chassis#033[00m Feb 23 04:58:12 localhost ovn_metadata_agent[163567]: 
2026-02-23 09:58:12.127 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 59b1761b-6159-4589-b0ab-d24692c6be4a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:12 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:12.129 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d6568e78-7528-4be6-8f7f-fbbfad844393]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost nova_compute[282206]: 2026-02-23 09:58:12.140 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:12 localhost ovn_controller[157695]: 2026-02-23T09:58:12Z|00255|binding|INFO|Setting lport 591e24ae-b5aa-4de2-bb90-f63788f17656 ovn-installed in OVS Feb 23 04:58:12 localhost ovn_controller[157695]: 2026-02-23T09:58:12Z|00256|binding|INFO|Setting lport 591e24ae-b5aa-4de2-bb90-f63788f17656 up in Southbound Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost journal[231253]: ethtool ioctl error on tap591e24ae-b5: No such device Feb 23 04:58:12 localhost nova_compute[282206]: 2026-02-23 09:58:12.178 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:12 localhost nova_compute[282206]: 2026-02-23 09:58:12.209 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e140 do_prune osdmap full prune enabled Feb 23 04:58:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 e141: 6 total, 6 up, 6 in Feb 23 04:58:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in Feb 23 04:58:12 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:12.353 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:58:12 localhost dnsmasq[314933]: exiting on receipt of SIGTERM Feb 23 04:58:12 localhost podman[315670]: 2026-02-23 09:58:12.837295506 +0000 UTC m=+0.055628209 container kill 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 04:58:12 localhost systemd[1]: libpod-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1.scope: Deactivated successfully. 
Feb 23 04:58:12 localhost podman[315683]: 2026-02-23 09:58:12.903069198 +0000 UTC m=+0.052706059 container died 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2) Feb 23 04:58:13 localhost systemd[1]: tmp-crun.fXp0zA.mount: Deactivated successfully. Feb 23 04:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:58:13 localhost podman[315683]: 2026-02-23 09:58:13.087913279 +0000 UTC m=+0.237550080 container cleanup 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 04:58:13 localhost systemd[1]: libpod-conmon-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1.scope: Deactivated successfully. 
Feb 23 04:58:13 localhost podman[315685]: 2026-02-23 09:58:13.11027948 +0000 UTC m=+0.250722087 container remove 61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c843867f-296c-42fa-9f8c-55712f0f7c56, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:58:13 localhost nova_compute[282206]: 2026-02-23 09:58:13.157 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:13 localhost podman[315728]: 2026-02-23 09:58:13.182919934 +0000 UTC m=+0.113567729 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 
'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 04:58:13 localhost podman[315728]: 2026-02-23 09:58:13.193085799 +0000 UTC m=+0.123733594 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 
'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_managed=true) Feb 23 04:58:13 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:58:13 localhost podman[315744]: Feb 23 04:58:13 localhost podman[315744]: 2026-02-23 09:58:13.249630465 +0000 UTC m=+0.075153523 container create 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 04:58:13 localhost systemd[1]: Started libpod-conmon-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1.scope. Feb 23 04:58:13 localhost systemd[1]: Started libcrun container. 
Feb 23 04:58:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6596b8fd2403688e1c2846251827eb8351db900fadb6ba717052caf8eb93089/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:58:13 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:13.304 265541 INFO neutron.agent.dhcp.agent [None req-ab0e1c05-a52a-4777-94ac-5fda81f8a7db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:13 localhost podman[315744]: 2026-02-23 09:58:13.305988817 +0000 UTC m=+0.131511895 container init 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:58:13 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:13.306 265541 INFO neutron.agent.dhcp.agent [None req-ab0e1c05-a52a-4777-94ac-5fda81f8a7db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:13 localhost podman[315744]: 2026-02-23 09:58:13.318835124 +0000 UTC m=+0.144358202 container start 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:58:13 localhost podman[315744]: 2026-02-23 09:58:13.221458825 +0000 UTC m=+0.046981893 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:58:13 localhost dnsmasq[315769]: started, version 2.85 cachesize 150 Feb 23 04:58:13 localhost dnsmasq[315769]: DNS service limited to local subnets Feb 23 04:58:13 localhost dnsmasq[315769]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:58:13 localhost dnsmasq[315769]: warning: no upstream servers configured Feb 23 04:58:13 localhost dnsmasq-dhcp[315769]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:58:13 localhost dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 0 addresses Feb 23 04:58:13 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host Feb 23 04:58:13 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts Feb 23 04:58:13 localhost openstack_network_exporter[245358]: ERROR 09:58:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:58:13 localhost openstack_network_exporter[245358]: Feb 23 04:58:13 localhost openstack_network_exporter[245358]: ERROR 09:58:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:58:13 localhost openstack_network_exporter[245358]: Feb 23 04:58:13 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:13.451 265541 INFO neutron.agent.dhcp.agent [None req-81980b94-9ebd-460c-b985-f9233238ccc5 - - - - - -] DHCP configuration for ports {'59da67d1-dc5c-4e0b-8d29-1109ee1dfd79'} is completed#033[00m Feb 23 04:58:13 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:13.819 2 INFO neutron.agent.securitygroups_rpc [None 
req-eb522331-80d1-4b00-bd36-6fd8378962f5 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:13 localhost systemd[1]: var-lib-containers-storage-overlay-72bec57338a9b4c2dc713f40ed6af99e9442fa24f4fd216de691c92203a3e570-merged.mount: Deactivated successfully. Feb 23 04:58:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61844cecbb88e05cd64c9c8c1b4fbab159f87a7bad8c1686f8d1bec04920bec1-userdata-shm.mount: Deactivated successfully. Feb 23 04:58:13 localhost systemd[1]: run-netns-qdhcp\x2dc843867f\x2d296c\x2d42fa\x2d9f8c\x2d55712f0f7c56.mount: Deactivated successfully. Feb 23 04:58:14 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:14.207 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:58:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/873803536' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:58:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:58:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/873803536' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:58:15 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:15.157 2 INFO neutron.agent.securitygroups_rpc [None req-4d44920c-4a63-4197-972a-c30d277ee529 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:15 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:15.161 2 INFO neutron.agent.securitygroups_rpc [None req-54919a2c-3a58-4a60-9867-0e5c23ba956a 8ff2abb777c74a6dbae4721d46f0d17a 182b0ebb06754cfab10ebabcdf7056ed - - default default] Security group member updated ['c029b069-aec5-44a4-9af0-e58cbf64895c']#033[00m Feb 23 04:58:15 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:15.463 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:15Z, description=, device_id=2eb0a382-819f-43ad-8f50-a7e13025787c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=12133776-1eda-452b-a19f-0bf1cb92b9c1, ip_allocation=immediate, mac_address=fa:16:3e:f6:82:dc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:09Z, description=, dns_domain=, id=59b1761b-6159-4589-b0ab-d24692c6be4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-543621127, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58541, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=1991, status=ACTIVE, subnets=['dbc5e08e-6d95-4682-93af-5a6ba15e0328'], tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:10Z, vlan_transparent=None, network_id=59b1761b-6159-4589-b0ab-d24692c6be4a, port_security_enabled=False, project_id=983d362fe1064ddd8f80d65a731f1168, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2004, status=DOWN, tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:15Z on network 59b1761b-6159-4589-b0ab-d24692c6be4a#033[00m Feb 23 04:58:15 localhost systemd[1]: tmp-crun.28n2Ib.mount: Deactivated successfully. Feb 23 04:58:15 localhost podman[315787]: 2026-02-23 09:58:15.719822943 +0000 UTC m=+0.064598086 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:58:15 localhost dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 1 addresses Feb 23 04:58:15 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host Feb 23 04:58:15 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts Feb 23 04:58:15 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:15.901 265541 INFO neutron.agent.dhcp.agent [None req-2b0f6ab5-7875-4b16-821f-35040e07a6c9 - - - - - -] DHCP configuration for ports {'12133776-1eda-452b-a19f-0bf1cb92b9c1'} is completed#033[00m Feb 23 
04:58:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:16.700 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:15Z, description=, device_id=2eb0a382-819f-43ad-8f50-a7e13025787c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=12133776-1eda-452b-a19f-0bf1cb92b9c1, ip_allocation=immediate, mac_address=fa:16:3e:f6:82:dc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:09Z, description=, dns_domain=, id=59b1761b-6159-4589-b0ab-d24692c6be4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-543621127, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58541, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, subnets=['dbc5e08e-6d95-4682-93af-5a6ba15e0328'], tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:10Z, vlan_transparent=None, network_id=59b1761b-6159-4589-b0ab-d24692c6be4a, port_security_enabled=False, project_id=983d362fe1064ddd8f80d65a731f1168, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2004, status=DOWN, tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:15Z on network 59b1761b-6159-4589-b0ab-d24692c6be4a#033[00m Feb 23 04:58:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:58:16 localhost podman[315809]: 2026-02-23 09:58:16.909940903 +0000 UTC m=+0.086360159 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:16 localhost 
podman[315809]: 2026-02-23 09:58:16.921329845 +0000 UTC m=+0.097749091 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, tcib_managed=true) Feb 23 04:58:16 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:58:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:17 localhost podman[315842]: 2026-02-23 09:58:17.054529539 +0000 UTC m=+0.058213509 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 23 04:58:17 localhost dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 1 addresses Feb 23 04:58:17 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host Feb 23 04:58:17 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts Feb 23 04:58:17 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:17.307 265541 INFO neutron.agent.dhcp.agent [None req-55a8c3e2-35cc-4d48-aa2d-26993f90f86d - - - - - -] DHCP configuration for ports {'12133776-1eda-452b-a19f-0bf1cb92b9c1'} is completed#033[00m Feb 23 04:58:17 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:17.716 2 INFO neutron.agent.securitygroups_rpc [None req-d1c62e78-6e68-4888-a717-f17406167923 1a9e25d9a0c746578e1b6c457935b6c2 983d362fe1064ddd8f80d65a731f1168 - - default default] Security group member updated ['011ab8d8-354c-4fb1-b0db-21af2eca313e']#033[00m Feb 23 04:58:17 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:17.734 265541 
INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:17Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=de6b3f04-06ea-414f-8da1-27ddb41e69e1, ip_allocation=immediate, mac_address=fa:16:3e:42:93:58, name=tempest-FloatingIPNegativeTestJSON-1931896205, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:09Z, description=, dns_domain=, id=59b1761b-6159-4589-b0ab-d24692c6be4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-543621127, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=58541, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1991, status=ACTIVE, subnets=['dbc5e08e-6d95-4682-93af-5a6ba15e0328'], tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:10Z, vlan_transparent=None, network_id=59b1761b-6159-4589-b0ab-d24692c6be4a, port_security_enabled=True, project_id=983d362fe1064ddd8f80d65a731f1168, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['011ab8d8-354c-4fb1-b0db-21af2eca313e'], standard_attr_id=2021, status=DOWN, tags=[], tenant_id=983d362fe1064ddd8f80d65a731f1168, updated_at=2026-02-23T09:58:17Z on network 59b1761b-6159-4589-b0ab-d24692c6be4a#033[00m Feb 23 04:58:18 localhost systemd[1]: tmp-crun.6T9ZWj.mount: Deactivated successfully. 
Feb 23 04:58:18 localhost dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 2 addresses Feb 23 04:58:18 localhost podman[315880]: 2026-02-23 09:58:18.015974474 +0000 UTC m=+0.058933772 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:18 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host Feb 23 04:58:18 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts Feb 23 04:58:18 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:18.041 2 INFO neutron.agent.securitygroups_rpc [None req-45386671-0a66-4623-b814-6d3841258b3c 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:18 localhost nova_compute[282206]: 2026-02-23 09:58:18.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 04:58:18 localhost nova_compute[282206]: 2026-02-23 09:58:18.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:18 localhost nova_compute[282206]: 2026-02-23 09:58:18.161 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 04:58:18 localhost nova_compute[282206]: 2026-02-23 09:58:18.162 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:58:18 localhost nova_compute[282206]: 2026-02-23 09:58:18.162 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 04:58:18 localhost nova_compute[282206]: 2026-02-23 09:58:18.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:18 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:18.294 265541 INFO neutron.agent.dhcp.agent [None req-422fbba8-ec27-4586-80f8-2bb96cf2b3ca - - - - - -] DHCP configuration for ports {'de6b3f04-06ea-414f-8da1-27ddb41e69e1'} is completed#033[00m Feb 23 04:58:18 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:18.765 2 INFO neutron.agent.securitygroups_rpc [None req-ad743b42-9d5f-46bc-bf0d-b2f432d91b64 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:20 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:20.749 2 INFO neutron.agent.securitygroups_rpc [None req-e6124d39-875a-4fc5-8c30-4d7caf025748 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 
23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.157 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.157 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.158 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:58:21 localhost nova_compute[282206]: 2026-02-23 09:58:21.158 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:58:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) 
e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:58:21 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:58:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:21.577 2 INFO neutron.agent.securitygroups_rpc [None req-362b8c0d-6857-4758-a913-b5b9ee733cd1 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:21.618 2 INFO neutron.agent.securitygroups_rpc [None req-12654231-88d0-4565-b196-1dda145060e4 1a9e25d9a0c746578e1b6c457935b6c2 983d362fe1064ddd8f80d65a731f1168 - - default default] Security group member updated ['011ab8d8-354c-4fb1-b0db-21af2eca313e']#033[00m Feb 23 04:58:21 localhost dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 1 addresses Feb 23 04:58:21 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host Feb 23 04:58:21 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts Feb 23 04:58:21 localhost podman[316005]: 2026-02-23 09:58:21.877732545 +0000 UTC m=+0.062749089 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:58:22 localhost nova_compute[282206]: 2026-02-23 09:58:22.031 282211 DEBUG 
nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:58:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e141 do_prune osdmap full prune enabled Feb 23 04:58:22 localhost nova_compute[282206]: 2026-02-23 09:58:22.053 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 
23 04:58:22 localhost nova_compute[282206]: 2026-02-23 09:58:22.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:58:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 e142: 6 total, 6 up, 6 in Feb 23 04:58:22 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in Feb 23 04:58:22 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:58:22 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:58:22 localhost systemd[1]: tmp-crun.sm2aXv.mount: Deactivated successfully. Feb 23 04:58:22 localhost dnsmasq[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/addn_hosts - 0 addresses Feb 23 04:58:22 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/host Feb 23 04:58:22 localhost dnsmasq-dhcp[315769]: read /var/lib/neutron/dhcp/59b1761b-6159-4589-b0ab-d24692c6be4a/opts Feb 23 04:58:22 localhost podman[316042]: 2026-02-23 09:58:22.73629223 +0000 UTC m=+0.070489208 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:58:22 localhost kernel: device 
tap591e24ae-b5 left promiscuous mode Feb 23 04:58:22 localhost ovn_controller[157695]: 2026-02-23T09:58:22Z|00257|binding|INFO|Releasing lport 591e24ae-b5aa-4de2-bb90-f63788f17656 from this chassis (sb_readonly=0) Feb 23 04:58:22 localhost ovn_controller[157695]: 2026-02-23T09:58:22Z|00258|binding|INFO|Setting lport 591e24ae-b5aa-4de2-bb90-f63788f17656 down in Southbound Feb 23 04:58:22 localhost nova_compute[282206]: 2026-02-23 09:58:22.907 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:22.924 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59b1761b-6159-4589-b0ab-d24692c6be4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '983d362fe1064ddd8f80d65a731f1168', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0f1210f-d7d4-4dc7-9c80-9ac59d666074, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=591e24ae-b5aa-4de2-bb90-f63788f17656) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:22.926 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 591e24ae-b5aa-4de2-bb90-f63788f17656 in datapath 59b1761b-6159-4589-b0ab-d24692c6be4a unbound from our chassis#033[00m Feb 23 04:58:22 localhost nova_compute[282206]: 2026-02-23 09:58:22.928 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:22.929 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59b1761b-6159-4589-b0ab-d24692c6be4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:58:22 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:22.930 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b41241db-4cfb-494a-9266-1717f1eba2ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:23 localhost nova_compute[282206]: 2026-02-23 09:58:23.163 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:23 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:23.828 2 INFO neutron.agent.securitygroups_rpc [None req-843b8c0e-9c58-433a-b8a5-d142a0ae4b56 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:24 localhost nova_compute[282206]: 2026-02-23 09:58:24.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:24 localhost nova_compute[282206]: 2026-02-23 09:58:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:58:24 localhost systemd[1]: tmp-crun.UimNtR.mount: Deactivated successfully. Feb 23 04:58:24 localhost dnsmasq[315769]: exiting on receipt of SIGTERM Feb 23 04:58:24 localhost podman[316084]: 2026-02-23 09:58:24.345750626 +0000 UTC m=+0.068027182 container kill 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:58:24 localhost systemd[1]: libpod-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1.scope: Deactivated successfully. 
Feb 23 04:58:24 localhost podman[316099]: 2026-02-23 09:58:24.416587664 +0000 UTC m=+0.054883116 container died 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:58:24 localhost podman[316099]: 2026-02-23 09:58:24.451342639 +0000 UTC m=+0.089638021 container cleanup 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0) Feb 23 04:58:24 localhost systemd[1]: libpod-conmon-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1.scope: Deactivated successfully. 
Feb 23 04:58:24 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:24.457 2 INFO neutron.agent.securitygroups_rpc [None req-f2783f1b-db17-4abb-b550-2a7eaeb7f1e9 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:24 localhost podman[316100]: 2026-02-23 09:58:24.499441934 +0000 UTC m=+0.134076763 container remove 8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-59b1761b-6159-4589-b0ab-d24692c6be4a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:58:24 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:24.531 265541 INFO neutron.agent.dhcp.agent [None req-75265f1a-11dc-4535-ae70-e8314a9f3b26 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:24 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:24.749 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:25 localhost nova_compute[282206]: 2026-02-23 09:58:25.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:25 localhost ovn_controller[157695]: 2026-02-23T09:58:25Z|00259|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:58:25 localhost nova_compute[282206]: 2026-02-23 
09:58:25.099 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:25 localhost systemd[1]: tmp-crun.t2fHr6.mount: Deactivated successfully. Feb 23 04:58:25 localhost systemd[1]: var-lib-containers-storage-overlay-a6596b8fd2403688e1c2846251827eb8351db900fadb6ba717052caf8eb93089-merged.mount: Deactivated successfully. Feb 23 04:58:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ab3b14bd71db46ee1e28e4ce44c5c50d8f5aba2d8c1832a51fb8d6387f4bde1-userdata-shm.mount: Deactivated successfully. Feb 23 04:58:25 localhost systemd[1]: run-netns-qdhcp\x2d59b1761b\x2d6159\x2d4589\x2db0ab\x2dd24692c6be4a.mount: Deactivated successfully. Feb 23 04:58:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:58:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:58:26 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:26.247 2 INFO neutron.agent.securitygroups_rpc [None req-e81a47f4-2a39-4882-ae5e-f110dbf1c96f 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:58:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 04:58:26 localhost podman[316130]: 2026-02-23 09:58:26.905987156 +0000 UTC m=+0.077839456 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 04:58:26 localhost podman[316130]: 2026-02-23 09:58:26.917487521 +0000 UTC m=+0.089339821 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:58:26 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:58:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:27 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:27.801 2 INFO neutron.agent.securitygroups_rpc [None req-99ddfa8e-0299-435b-a099-c8da64e3d700 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring 
lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.077 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.077 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command 
mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:58:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1889799132' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.584 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.646 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.646 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:58:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.840 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.842 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11359MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.842 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.843 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:58:28 localhost podman[316175]: 2026-02-23 09:58:28.888865019 +0000 UTC m=+0.064684299 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347) Feb 23 04:58:28 localhost podman[316175]: 2026-02-23 09:58:28.902180409 +0000 UTC m=+0.077999719 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, release=1770267347, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9) Feb 23 04:58:28 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.945 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.946 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:58:28 localhost nova_compute[282206]: 2026-02-23 09:58:28.946 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.004 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:58:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:58:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/591754470' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.471 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.478 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.506 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.509 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.509 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:58:29 localhost nova_compute[282206]: 2026-02-23 09:58:29.765 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:31 localhost nova_compute[282206]: 2026-02-23 09:58:31.510 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:31 localhost nova_compute[282206]: 2026-02-23 09:58:31.511 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:58:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2365892405' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:58:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:58:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2365892405' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.219 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:33.277 265541 INFO neutron.agent.linux.ip_lib [None req-d3ae1095-1940-42af-8546-1ec9cb187347 - - - - - -] Device tapd5a42e1b-50 cannot be used as it has no MAC address#033[00m Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.304 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:33 localhost kernel: device tapd5a42e1b-50 entered promiscuous mode Feb 23 04:58:33 localhost NetworkManager[5974]: [1771840713.3132] manager: (tapd5a42e1b-50): new Generic device (/org/freedesktop/NetworkManager/Devices/43) Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.315 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:33 localhost ovn_controller[157695]: 2026-02-23T09:58:33Z|00260|binding|INFO|Claiming lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 for this chassis. Feb 23 04:58:33 localhost ovn_controller[157695]: 2026-02-23T09:58:33Z|00261|binding|INFO|d5a42e1b-5089-41c4-9d02-d28b44b515d2: Claiming unknown Feb 23 04:58:33 localhost systemd-udevd[316226]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:58:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:33.331 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0421515e6bb54dea8db3ed218999e195', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cbbaed5-c16c-4b6f-96d8-1ef1b1b430f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d5a42e1b-5089-41c4-9d02-d28b44b515d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:33.335 163572 INFO neutron.agent.ovn.metadata.agent [-] Port d5a42e1b-5089-41c4-9d02-d28b44b515d2 in datapath ff7aa220-5765-44c6-9121-cfbd718241c5 bound to our chassis#033[00m Feb 23 04:58:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:33.338 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ff7aa220-5765-44c6-9121-cfbd718241c5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:33 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:33.339 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d22036d3-1127-4cea-8b56-defdc6f40853]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost ovn_controller[157695]: 2026-02-23T09:58:33Z|00262|binding|INFO|Setting lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 ovn-installed in OVS Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost ovn_controller[157695]: 2026-02-23T09:58:33Z|00263|binding|INFO|Setting lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 up in Southbound Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.356 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost journal[231253]: ethtool ioctl error on tapd5a42e1b-50: No such device Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.389 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:33 localhost nova_compute[282206]: 2026-02-23 09:58:33.418 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:34 localhost podman[316297]: Feb 23 04:58:34 localhost podman[316297]: 2026-02-23 09:58:34.237039273 +0000 UTC m=+0.093965075 container create e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 04:58:34 localhost systemd[1]: Started libpod-conmon-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf.scope. Feb 23 04:58:34 localhost podman[316297]: 2026-02-23 09:58:34.191409323 +0000 UTC m=+0.048335175 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:58:34 localhost systemd[1]: Started libcrun container. 
Feb 23 04:58:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0a4908f0bc9328306b516915e2de85425e40798a526663541b1416ff04dc528a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:58:34 localhost podman[316297]: 2026-02-23 09:58:34.315895009 +0000 UTC m=+0.172820821 container init e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0) Feb 23 04:58:34 localhost podman[316297]: 2026-02-23 09:58:34.32499541 +0000 UTC m=+0.181921212 container start e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS) Feb 23 04:58:34 localhost dnsmasq[316315]: started, version 2.85 cachesize 150 Feb 23 04:58:34 localhost dnsmasq[316315]: DNS service limited to local subnets Feb 23 04:58:34 localhost dnsmasq[316315]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:58:34 localhost dnsmasq[316315]: warning: no upstream servers 
configured Feb 23 04:58:34 localhost dnsmasq-dhcp[316315]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:58:34 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 0 addresses Feb 23 04:58:34 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host Feb 23 04:58:34 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts Feb 23 04:58:34 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:34.559 265541 INFO neutron.agent.dhcp.agent [None req-47199348-c40d-463e-b620-75c60ffc1f48 - - - - - -] DHCP configuration for ports {'d908999e-0d8c-4805-bfe5-8996d9567d4b'} is completed#033[00m Feb 23 04:58:35 localhost nova_compute[282206]: 2026-02-23 09:58:35.172 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:58:36 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1157746556' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:58:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:58:36 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1157746556' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:58:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:37 localhost nova_compute[282206]: 2026-02-23 09:58:37.321 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:38 localhost nova_compute[282206]: 2026-02-23 09:58:38.222 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 04:58:38 localhost podman[316316]: 2026-02-23 09:58:38.916436244 +0000 UTC m=+0.089844467 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:38 localhost systemd[1]: tmp-crun.Il6Uzl.mount: Deactivated successfully. 
Feb 23 04:58:38 localhost podman[316317]: 2026-02-23 09:58:38.989639376 +0000 UTC m=+0.157031123 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 04:58:39 localhost podman[316317]: 2026-02-23 09:58:39.021681676 +0000 UTC m=+0.189073403 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:58:39 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:58:39 localhost podman[316316]: 2026-02-23 09:58:39.079588574 +0000 UTC m=+0.252996737 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Feb 23 04:58:39 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 04:58:39 localhost podman[242954]: time="2026-02-23T09:58:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:58:39 localhost podman[242954]: @ - - [23/Feb/2026:09:58:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1" Feb 23 04:58:39 localhost podman[242954]: @ - - [23/Feb/2026:09:58:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19284 "" "Go-http-client/1.1" Feb 23 04:58:39 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:39.699 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:38Z, description=, device_id=c42435b0-1c04-411a-b921-da46209bd2fe, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=149dde58-b9e7-4644-99ab-581857e961de, ip_allocation=immediate, mac_address=fa:16:3e:86:80:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:30Z, description=, dns_domain=, id=ff7aa220-5765-44c6-9121-cfbd718241c5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2004318845-network, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2127, status=ACTIVE, subnets=['8e5f5052-1626-4168-ae6d-3107f2c16e7a'], tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:32Z, vlan_transparent=None, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, 
port_security_enabled=False, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2185, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:39Z on network ff7aa220-5765-44c6-9121-cfbd718241c5#033[00m Feb 23 04:58:39 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 1 addresses Feb 23 04:58:39 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host Feb 23 04:58:39 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts Feb 23 04:58:39 localhost podman[316381]: 2026-02-23 09:58:39.918929466 +0000 UTC m=+0.061896503 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 04:58:40 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:40.161 265541 INFO neutron.agent.dhcp.agent [None req-7ac5cd57-3586-4f5d-8634-63691d8f7931 - - - - - -] DHCP configuration for ports {'149dde58-b9e7-4644-99ab-581857e961de'} is completed#033[00m Feb 23 04:58:40 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:40.903 265541 INFO neutron.agent.linux.ip_lib [None req-a0dde59b-31ed-4f7b-8c73-25dda12e803d - - - - - -] Device tap99eeaa57-51 cannot be used as it has no MAC address#033[00m Feb 23 04:58:40 localhost nova_compute[282206]: 2026-02-23 09:58:40.927 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:40 localhost kernel: device tap99eeaa57-51 entered promiscuous mode Feb 23 04:58:40 localhost ovn_controller[157695]: 2026-02-23T09:58:40Z|00264|binding|INFO|Claiming lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 for this chassis. Feb 23 04:58:40 localhost NetworkManager[5974]: [1771840720.9334] manager: (tap99eeaa57-51): new Generic device (/org/freedesktop/NetworkManager/Devices/44) Feb 23 04:58:40 localhost ovn_controller[157695]: 2026-02-23T09:58:40Z|00265|binding|INFO|99eeaa57-5103-4a27-8cb6-f740d5ffcf80: Claiming unknown Feb 23 04:58:40 localhost nova_compute[282206]: 2026-02-23 09:58:40.933 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:40 localhost systemd-udevd[316413]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:58:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:40.945 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a6c1ed33b7a401e921451e25668daed', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8ac0cd8-62b1-44f3-b0a9-7a358af2ef4f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=99eeaa57-5103-4a27-8cb6-f740d5ffcf80) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:40.947 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 in datapath fd854bec-4386-47ab-bc93-a08354b81ab6 bound to our chassis#033[00m Feb 23 04:58:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:40.949 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fd854bec-4386-47ab-bc93-a08354b81ab6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:40.951 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[33d2122f-10ea-42f5-a681-46aff8deb8f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:40 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:40 localhost ovn_controller[157695]: 2026-02-23T09:58:40Z|00266|binding|INFO|Setting lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 ovn-installed in OVS Feb 23 04:58:40 localhost ovn_controller[157695]: 2026-02-23T09:58:40Z|00267|binding|INFO|Setting lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 up in Southbound Feb 23 04:58:40 localhost nova_compute[282206]: 2026-02-23 09:58:40.968 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:40 localhost journal[231253]: 
ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:40 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:40 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:40 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:40 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:41 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:41 localhost journal[231253]: ethtool ioctl error on tap99eeaa57-51: No such device Feb 23 04:58:41 localhost nova_compute[282206]: 2026-02-23 09:58:41.011 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:41 localhost nova_compute[282206]: 2026-02-23 09:58:41.039 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:41 localhost podman[316484]: Feb 23 04:58:41 localhost podman[316484]: 2026-02-23 09:58:41.954430084 +0000 UTC m=+0.095259224 container create 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:58:42 localhost systemd[1]: Started libpod-conmon-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137.scope. 
Feb 23 04:58:42 localhost podman[316484]: 2026-02-23 09:58:41.909546218 +0000 UTC m=+0.050375418 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:58:42 localhost systemd[1]: Started libcrun container. Feb 23 04:58:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b4d99b1f623479e08c1ba172833c2002e032a9da0c51ca920aa6a9e5f04cbc7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:58:42 localhost podman[316484]: 2026-02-23 09:58:42.03552608 +0000 UTC m=+0.176355220 container init 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS) Feb 23 04:58:42 localhost podman[316484]: 2026-02-23 09:58:42.045752475 +0000 UTC m=+0.186581615 container start 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 04:58:42 localhost dnsmasq[316503]: started, version 2.85 cachesize 150 Feb 23 04:58:42 localhost dnsmasq[316503]: DNS service limited to local subnets Feb 23 04:58:42 localhost 
dnsmasq[316503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:58:42 localhost dnsmasq[316503]: warning: no upstream servers configured Feb 23 04:58:42 localhost dnsmasq-dhcp[316503]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:58:42 localhost dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 0 addresses Feb 23 04:58:42 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host Feb 23 04:58:42 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts Feb 23 04:58:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:42 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:42.245 265541 INFO neutron.agent.dhcp.agent [None req-4a7eaf64-14f3-44eb-9e50-50827eb5357e - - - - - -] DHCP configuration for ports {'3e53d539-7a24-4706-8d58-d2ed9e960162'} is completed#033[00m Feb 23 04:58:42 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:42.707 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:38Z, description=, device_id=c42435b0-1c04-411a-b921-da46209bd2fe, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=149dde58-b9e7-4644-99ab-581857e961de, ip_allocation=immediate, mac_address=fa:16:3e:86:80:c6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:30Z, description=, dns_domain=, id=ff7aa220-5765-44c6-9121-cfbd718241c5, 
ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2004318845-network, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2127, status=ACTIVE, subnets=['8e5f5052-1626-4168-ae6d-3107f2c16e7a'], tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:32Z, vlan_transparent=None, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=False, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2185, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:58:39Z on network ff7aa220-5765-44c6-9121-cfbd718241c5#033[00m Feb 23 04:58:42 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 1 addresses Feb 23 04:58:42 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host Feb 23 04:58:42 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts Feb 23 04:58:42 localhost podman[316521]: 2026-02-23 09:58:42.968815234 +0000 UTC m=+0.062412590 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
io.buildah.version=1.43.0) Feb 23 04:58:43 localhost nova_compute[282206]: 2026-02-23 09:58:43.225 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:43.294 265541 INFO neutron.agent.dhcp.agent [None req-1e6d6e7f-5755-4b2f-a32d-639afa4ca577 - - - - - -] DHCP configuration for ports {'149dde58-b9e7-4644-99ab-581857e961de'} is completed#033[00m Feb 23 04:58:43 localhost openstack_network_exporter[245358]: ERROR 09:58:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:58:43 localhost openstack_network_exporter[245358]: Feb 23 04:58:43 localhost openstack_network_exporter[245358]: ERROR 09:58:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:58:43 localhost openstack_network_exporter[245358]: Feb 23 04:58:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 04:58:43 localhost podman[316542]: 2026-02-23 09:58:43.919923638 +0000 UTC m=+0.087774172 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible) Feb 23 04:58:43 localhost podman[316542]: 2026-02-23 09:58:43.938431841 +0000 UTC m=+0.106282385 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS) Feb 23 04:58:43 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:58:45 localhost nova_compute[282206]: 2026-02-23 09:58:45.966 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:46 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:46.745 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:46Z, description=, device_id=c5c52445-4186-4d74-aba9-70f654d51933, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1a2c6d24-642f-408f-84d0-5d51e879d5dc, ip_allocation=immediate, mac_address=fa:16:3e:ab:19:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:36Z, description=, dns_domain=, id=fd854bec-4386-47ab-bc93-a08354b81ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-716275198-network, port_security_enabled=True, project_id=6a6c1ed33b7a401e921451e25668daed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59549, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2181, status=ACTIVE, subnets=['1cbee800-b3d4-414b-b23f-90c05ceb0493'], tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:39Z, vlan_transparent=None, network_id=fd854bec-4386-47ab-bc93-a08354b81ab6, port_security_enabled=False, project_id=6a6c1ed33b7a401e921451e25668daed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], 
standard_attr_id=2232, status=DOWN, tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:46Z on network fd854bec-4386-47ab-bc93-a08354b81ab6#033[00m Feb 23 04:58:46 localhost podman[316578]: 2026-02-23 09:58:46.994318224 +0000 UTC m=+0.064350689 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:58:46 localhost dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 1 addresses Feb 23 04:58:47 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host Feb 23 04:58:47 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts Feb 23 04:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:58:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:47 localhost systemd[1]: tmp-crun.XTueAG.mount: Deactivated successfully. 
Feb 23 04:58:47 localhost podman[316593]: 2026-02-23 09:58:47.1233449 +0000 UTC m=+0.103726525 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:58:47 localhost 
podman[316593]: 2026-02-23 09:58:47.156388522 +0000 UTC m=+0.136770177 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:47 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:58:47 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:47.271 265541 INFO neutron.agent.dhcp.agent [None req-0fe33b19-dda4-4592-a150-29072e1cc8d8 - - - - - -] DHCP configuration for ports {'1a2c6d24-642f-408f-84d0-5d51e879d5dc'} is completed#033[00m Feb 23 04:58:48 localhost nova_compute[282206]: 2026-02-23 09:58:48.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:48.293 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:46Z, description=, device_id=c5c52445-4186-4d74-aba9-70f654d51933, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1a2c6d24-642f-408f-84d0-5d51e879d5dc, ip_allocation=immediate, mac_address=fa:16:3e:ab:19:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:36Z, description=, dns_domain=, id=fd854bec-4386-47ab-bc93-a08354b81ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-716275198-network, port_security_enabled=True, project_id=6a6c1ed33b7a401e921451e25668daed, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59549, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2181, status=ACTIVE, subnets=['1cbee800-b3d4-414b-b23f-90c05ceb0493'], tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:39Z, vlan_transparent=None, 
network_id=fd854bec-4386-47ab-bc93-a08354b81ab6, port_security_enabled=False, project_id=6a6c1ed33b7a401e921451e25668daed, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2232, status=DOWN, tags=[], tenant_id=6a6c1ed33b7a401e921451e25668daed, updated_at=2026-02-23T09:58:46Z on network fd854bec-4386-47ab-bc93-a08354b81ab6#033[00m Feb 23 04:58:48 localhost dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 1 addresses Feb 23 04:58:48 localhost podman[316636]: 2026-02-23 09:58:48.531442974 +0000 UTC m=+0.060927133 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:58:48 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host Feb 23 04:58:48 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.558 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.559 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.560 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:58:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:48.775 265541 INFO neutron.agent.dhcp.agent [None req-501debf8-4dac-46d2-a51e-8d2330460ac6 - - - - - -] DHCP configuration for ports {'1a2c6d24-642f-408f-84d0-5d51e879d5dc'} is completed#033[00m Feb 23 04:58:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:48.888 265541 INFO neutron.agent.linux.ip_lib [None req-0693f31f-1e33-4e13-93a5-4bf59a810c38 - - - - - -] Device tap244c57bd-d2 cannot be used as it has no MAC address#033[00m Feb 23 04:58:48 localhost nova_compute[282206]: 2026-02-23 09:58:48.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:48 localhost kernel: device tap244c57bd-d2 entered promiscuous mode Feb 23 04:58:48 localhost ovn_controller[157695]: 2026-02-23T09:58:48Z|00268|binding|INFO|Claiming lport 244c57bd-d22b-4733-8be3-0ce88383151e for this chassis. 
Feb 23 04:58:48 localhost ovn_controller[157695]: 2026-02-23T09:58:48Z|00269|binding|INFO|244c57bd-d22b-4733-8be3-0ce88383151e: Claiming unknown Feb 23 04:58:48 localhost NetworkManager[5974]: [1771840728.9207] manager: (tap244c57bd-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Feb 23 04:58:48 localhost nova_compute[282206]: 2026-02-23 09:58:48.924 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:48 localhost systemd-udevd[316668]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.940 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f70b6503-965d-4883-bf00-ea5f7a873818, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=244c57bd-d22b-4733-8be3-0ce88383151e) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.942 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 244c57bd-d22b-4733-8be3-0ce88383151e in datapath 169a9bd5-a623-4b9d-83a6-bf3f6708a358 bound to our chassis#033[00m Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.945 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 169a9bd5-a623-4b9d-83a6-bf3f6708a358 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:58:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:48.945 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[0a8d18cb-975c-4bf5-a185-bfb454bb6e98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:48 localhost ovn_controller[157695]: 2026-02-23T09:58:48Z|00270|binding|INFO|Setting lport 244c57bd-d22b-4733-8be3-0ce88383151e ovn-installed in OVS Feb 23 04:58:48 localhost ovn_controller[157695]: 2026-02-23T09:58:48Z|00271|binding|INFO|Setting lport 244c57bd-d22b-4733-8be3-0ce88383151e up in Southbound Feb 23 04:58:48 localhost nova_compute[282206]: 2026-02-23 09:58:48.957 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:48 localhost nova_compute[282206]: 2026-02-23 09:58:48.958 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such 
device Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:48 localhost journal[231253]: ethtool ioctl error on tap244c57bd-d2: No such device Feb 23 04:58:49 localhost nova_compute[282206]: 2026-02-23 09:58:49.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:49 localhost nova_compute[282206]: 2026-02-23 09:58:49.039 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:50 localhost podman[316739]: Feb 23 04:58:50 localhost podman[316739]: 2026-02-23 09:58:50.12474214 +0000 UTC m=+0.088238017 container create 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:58:50 localhost systemd[1]: Started libpod-conmon-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576.scope. 
Feb 23 04:58:50 localhost podman[316739]: 2026-02-23 09:58:50.081994649 +0000 UTC m=+0.045490586 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:58:50 localhost systemd[1]: Started libcrun container. Feb 23 04:58:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/74cfcad24671a33e4fe137c8b775f0c8c7b0e5f093f9846bc6f205eac1c88d45/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:58:50 localhost podman[316739]: 2026-02-23 09:58:50.209728586 +0000 UTC m=+0.173224463 container init 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:58:50 localhost podman[316739]: 2026-02-23 09:58:50.219032253 +0000 UTC m=+0.182528130 container start 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:58:50 localhost dnsmasq[316758]: started, version 2.85 cachesize 150 Feb 23 04:58:50 localhost dnsmasq[316758]: DNS service limited to local subnets Feb 23 04:58:50 localhost 
dnsmasq[316758]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:58:50 localhost dnsmasq[316758]: warning: no upstream servers configured Feb 23 04:58:50 localhost dnsmasq-dhcp[316758]: DHCP, static leases only on 10.103.0.0, lease time 1d Feb 23 04:58:50 localhost dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 0 addresses Feb 23 04:58:50 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host Feb 23 04:58:50 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts Feb 23 04:58:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:50.425 265541 INFO neutron.agent.dhcp.agent [None req-346905fc-8667-42f4-96f4-e0f0e63b898a - - - - - -] DHCP configuration for ports {'d5e8428f-bde0-41bf-9c99-6da36b6352c2'} is completed#033[00m Feb 23 04:58:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:51.391 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:51.393 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:58:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:51.397 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:58:51 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:51.398 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[742bb80e-ef74-42c4-ad92-d46f92e656fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:51 localhost sshd[316759]: main: sshd: ssh-rsa 
algorithm is disabled Feb 23 04:58:51 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:51.457 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:50Z, description=, device_id=cfa27e09-aa47-4679-8e70-2bef8c0fc3b1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=da6ff8ec-6264-4847-beac-d4de76d2b819, ip_allocation=immediate, mac_address=fa:16:3e:fd:a2:ba, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:45Z, description=, dns_domain=, id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1315899538, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65364, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2230, status=ACTIVE, subnets=['3a493491-cb34-442e-b652-e18d508e6e1e'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:47Z, vlan_transparent=None, network_id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2249, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:51Z on network 169a9bd5-a623-4b9d-83a6-bf3f6708a358#033[00m Feb 23 04:58:51 localhost dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 1 addresses Feb 23 04:58:51 localhost dnsmasq-dhcp[316758]: read 
/var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host Feb 23 04:58:51 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts Feb 23 04:58:51 localhost podman[316778]: 2026-02-23 09:58:51.704758165 +0000 UTC m=+0.057411844 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 04:58:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:52.006 265541 INFO neutron.agent.dhcp.agent [None req-7a20fdea-f7d6-4b00-9145-d1af8673ce12 - - - - - -] DHCP configuration for ports {'da6ff8ec-6264-4847-beac-d4de76d2b819'} is completed#033[00m Feb 23 04:58:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:52 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:52.187 2 INFO neutron.agent.securitygroups_rpc [None req-a57c617c-c4ee-4d55-b7f0-53311658d2fd 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:52.418 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 
'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:52.420 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:58:52 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:52.421 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:58:52 localhost nova_compute[282206]: 2026-02-23 09:58:52.461 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:53 localhost nova_compute[282206]: 2026-02-23 09:58:53.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:58:53 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3806430089' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:58:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:58:53 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3806430089' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:58:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:53.559 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:58:50Z, description=, device_id=cfa27e09-aa47-4679-8e70-2bef8c0fc3b1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=da6ff8ec-6264-4847-beac-d4de76d2b819, ip_allocation=immediate, mac_address=fa:16:3e:fd:a2:ba, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:45Z, description=, dns_domain=, id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1315899538, port_security_enabled=True, project_id=6aadd525d3dd402cb701922115d00291, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65364, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2230, status=ACTIVE, subnets=['3a493491-cb34-442e-b652-e18d508e6e1e'], tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:47Z, vlan_transparent=None, network_id=169a9bd5-a623-4b9d-83a6-bf3f6708a358, port_security_enabled=False, project_id=6aadd525d3dd402cb701922115d00291, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2249, status=DOWN, tags=[], tenant_id=6aadd525d3dd402cb701922115d00291, updated_at=2026-02-23T09:58:51Z on network 169a9bd5-a623-4b9d-83a6-bf3f6708a358#033[00m Feb 23 04:58:53 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:53.586 2 INFO neutron.agent.securitygroups_rpc [None req-bc2e33ea-0cb8-4312-bf00-811550151f9a 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:53 localhost dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 1 addresses Feb 23 04:58:53 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host Feb 23 04:58:53 localhost podman[316816]: 2026-02-23 09:58:53.8139679 +0000 UTC m=+0.060674606 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:58:53 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts Feb 23 04:58:54 localhost nova_compute[282206]: 2026-02-23 09:58:54.061 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:54 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:54.105 265541 INFO neutron.agent.dhcp.agent [None 
req-feeb937f-ece0-4b1b-91b1-cfc429dec68b - - - - - -] DHCP configuration for ports {'da6ff8ec-6264-4847-beac-d4de76d2b819'} is completed#033[00m Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.151 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.157 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f984c4e8-b3ca-42e5-8ac6-b8996d34ad92', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.152826', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4447247c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': 'dca78750f935e2471be500abbc7563f127c6bbd723446997d7ca144886e02515'}]}, 'timestamp': '2026-02-23 09:58:56.159746', '_unique_id': '5100e6f474ad43f3b3a5e198e98deeeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.162 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.196 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5e62cdf-6e36-44d4-97d3-dc3e9ea97823', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.164549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444cf974-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '04e7249fe5a9f96a8dc29d943288b122786d01b424b40410fde1e035804401e8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.164549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444d0752-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'ad8235436fb1eff9f50584497938b6d5e2d6168c1fae94f46e0a881add380c68'}]}, 'timestamp': '2026-02-23 09:58:56.197344', '_unique_id': 'd1c70a1bbfa6463c8ad09f6a36a4162c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.198 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1a07e759-3c80-4b14-8917-bf2010a02a0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.199390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444d6292-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '3e7849fb1959627cfb6c572a705e5bfb8f59187eed237f933f8edb84413e31bd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.199390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444d7494-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'ef568f22599a855d569fdbc0b6de88f53c0760314a6769b67d524ccb73cebbde'}]}, 'timestamp': '2026-02-23 09:58:56.200131', '_unique_id': 'dbe2215518d54ac68725b66ac163aa4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.201 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.202 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c812296c-6285-4143-aeac-fcbb87e11a9c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.201663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444dbac6-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'b170db6bdaaedca6c9e449a77091700248bbe32634bed7583591170607a4b986'}, {'source': 'openstack', 'counter_name': 
'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.201663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444dcc46-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '37c4fffb19a085e76f5ec3fa1d0d7a44b139704b774ecc86cf44b36f0b099ef4'}]}, 'timestamp': '2026-02-23 09:58:56.202373', '_unique_id': '36a198a83de04422815e4c2cb093c8d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.203 12 DEBUG ceilometer.compute.pollsters [-] 
c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60d31a62-345d-4652-b9c7-f1362b3219da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.203809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'444e0f9e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'dd960f0e86c00c7f38afd7a34df990a52122544dbb0ac8ed617e657f66cbcc3c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.203809', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444e2088-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'a7111064de063ed32ea9ad503d9866291040d0d733b291f9445bdd679bf4d283'}]}, 'timestamp': '2026-02-23 09:58:56.204533', '_unique_id': '25cb8dbcfc0441fba226fdb89d37fcdd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.205 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.206 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.206 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6273096-642d-438f-9ac0-5ac5364c33af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.206249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '444e6e30-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'dfc45d44a4fe631826f503e3b0814017d1db605c601a3c0741f6a1f2efe1d652'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.206249', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '444e7830-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '9d136adf2d0338743bd4c16baa78ec5017129f6c592876f1eab91aaeeb704ee6'}]}, 'timestamp': '2026-02-23 09:58:56.206970', '_unique_id': '4e6ed5ed46d34b7db834f2df7f5a53f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.207 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.208 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.usage in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb7599b7-43b3-4b93-ac75-35d7fe95b782', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.208703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '44509476-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'b3e0a67b4b17b16e7d326d34e614c1e7829906751495dadf66369b6a4bdce73b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.208703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '44509f16-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'bd47571e80eb4c6a25deb9d80fb05ad701f75f1d022b906c596cf46b71d398e3'}]}, 'timestamp': '2026-02-23 09:58:56.220941', '_unique_id': '8ddfe8c72a884241a13dc6146633c451'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.222 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44704524-4428-4d86-a7f8-c969e9b8ffab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.222362', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4450e3ae-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '5f8bb649c3d2181118df065add0bff854f0211e9cc911ba9d4c3123c735e2611'}]}, 'timestamp': '2026-02-23 09:58:56.222651', '_unique_id': '73186567fc4e4711896b76e58b195889'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
09:58:56.223 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.224 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cd605ae7-2f80-4a2c-a64e-c43f8dd48a14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.224038', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '445124fe-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '13c47d2ad3b01937c3b38c69ffc91372cbdfe52772454433092939f44057ca87'}]}, 'timestamp': '2026-02-23 09:58:56.224374', '_unique_id': 'aecccd5384aa4a328f14fab06c600c50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.225 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f01f88e-f48a-4dff-ab13-54e0f36a61af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.225784', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '44516a4a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '93129dec73ebf183a43e8da223a73d08304d86b5c490667f9460f404093c01ee'}]}, 'timestamp': '2026-02-23 09:58:56.226098', '_unique_id': '7c1dfbd0203947abb36d3cfa1944811c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.226 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6f19c06b-8a02-4f4b-91da-d4d8c8877bd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.227594', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4451b0ea-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '6ac4c5b4cc85b88d1465fe9e6c4408830302cae4bdab036612e08a03a2605060'}]}, 'timestamp': '2026-02-23 09:58:56.227932', '_unique_id': '0f92354c54c8465ca834f08799568ebb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74f8a78c-0287-4712-9463-7e031de5226c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.229343', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4451f5dc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '1d7c9f25b611ad00eea6f34a1a160fafb36599d2baa381f2b6fc30a1cea5e8d8'}]}, 'timestamp': '2026-02-23 09:58:56.229673', '_unique_id': '470dad61c532430381306ffff8a9b98f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.230 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.231 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '50463f94-82d3-4e14-8347-38a427e157e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.231261', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '44523efc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': 'd4824f254686d541fe018324a3bc4ac5e8533ec6901c5d08f4c61c649a46cc0f'}]}, 'timestamp': '2026-02-23 09:58:56.231644', '_unique_id': 'c428a30b27e7442780a2aef7767f27a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdee3c1d-81ed-46e5-8ee6-e87113edb262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.233155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '44528a38-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': 'f24e208638724fcd4d8ef784e95e41b088d31b6a25efded163bcdc8f68c8d655'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.233155', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '44529582-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.354036264, 'message_signature': '9e98305f6bfdf473645264618e4db59be256606d6eb8f0276a8ed69eff059f77'}]}, 'timestamp': '2026-02-23 09:58:56.233738', '_unique_id': '21b9477fdcf84556b69b8637ae90a551'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.235 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd71e00df-a4df-4800-83df-4d4d2a615bfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.236052', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '4452fb1c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': 'f2ec4afeb1c8ff9f29dabf6a6a1360fa73276d9dae1980724de598301a577e83'}]}, 'timestamp': '2026-02-23 09:58:56.236360', '_unique_id': '90018c81ec22449792b0ebfd63adadd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.236 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 14280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2f88f5f-efc8-4705-855b-ccf31cbac0b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14280000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:58:56.237692', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '44569ef2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.449263496, 'message_signature': '71ba07e11bc4b343f0a50627bc8a10a6deea2b0ff95e80bf920b546dfae0f576'}]}, 'timestamp': '2026-02-23 09:58:56.260211', '_unique_id': 'edd077817958429d8fe3af0d1409b73a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.260 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5e1fbc8-cb28-4018-9f09-328b20697ad2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.261803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4456e902-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'cbee7d8f15be757cf6baae76c6345938764a47eb7ae146583d75edb27003d67d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.261803', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4456f5be-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'b412034e9683c124ec6715ecc607792b13e92822b2c645eae244a1f859fa985d'}]}, 'timestamp': '2026-02-23 09:58:56.262452', '_unique_id': 'fd77681d6ae545f7b53a5ca68f2f9855'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.263 12 ERROR oslo_messaging.notify.messaging
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.264 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fea9d0e-5a81-4731-861a-77e9c1c14673', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.264156', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '44574762-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '6c6ac233e447a174253df767b9808abbd00d232946c64894b6e5cd8d1228f313'}]}, 'timestamp': '2026-02-23 09:58:56.264556', '_unique_id': 'c63290e0355b43db8bc61c9efa1ce1fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a7e0b21-5c37-4ef4-8856-816ae61dc4a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T09:58:56.266111', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '445790c8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.342317522, 'message_signature': '18cfe4da80f67ffeb405dfdaf18aa7c9dab651d1a011a0baa817769634dba447'}]}, 'timestamp': '2026-02-23 09:58:56.266404', '_unique_id': 'b7f2fc1e5f354320a5ea72d701d496bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.267 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '67ca4aac-0ea6-4ee9-b638-0e860a24c0a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T09:58:56.268005', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4457dc4a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.449263496, 'message_signature': 'a62342408bdf7b01d8c1b6e22d2b419bc1e51d9fb1ff5599bee0ae6b43b969ea'}]}, 'timestamp': '2026-02-23 09:58:56.268352', '_unique_id': '5fa038e51b5c4167a0b33acf73b2dcbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 04:58:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8c01e31-ad01-4a3a-9784-b39717a61af9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T09:58:56.269850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '44582330-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': '2b09d88f0b15d39696557d48cb0a9ac7a3114d9a148f284006ff2b53e4d12115'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T09:58:56.269850', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '44582f1a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12176.398189878, 'message_signature': 'b1b5e63ea4f8a536274ca6541a7c6728e8e6bd99cd1e4febfc0526c4ff996175'}]}, 'timestamp': '2026-02-23 09:58:56.270438', '_unique_id': '7e3a38aaf9d149a08f6024be2d5f9e94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging yield Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 04:58:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 04:58:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 09:58:56.271 12 ERROR oslo_messaging.notify.messaging Feb 23 04:58:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 
full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:58:57 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:57.182 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:57 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:57.184 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:58:57 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:57.188 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:58:57 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:57.189 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[142ca4f5-6cbc-40b8-85da-d8290640d8c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:58:57 localhost systemd[1]: tmp-crun.25ZlgV.mount: Deactivated successfully. 
Feb 23 04:58:57 localhost dnsmasq[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/addn_hosts - 0 addresses Feb 23 04:58:57 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/host Feb 23 04:58:57 localhost podman[316865]: 2026-02-23 09:58:57.939801769 +0000 UTC m=+0.065122822 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2) Feb 23 04:58:57 localhost dnsmasq-dhcp[316758]: read /var/lib/neutron/dhcp/169a9bd5-a623-4b9d-83a6-bf3f6708a358/opts Feb 23 04:58:57 localhost podman[316849]: 2026-02-23 09:58:57.927164429 +0000 UTC m=+0.100029021 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:58:58 localhost podman[316849]: 2026-02-23 09:58:58.006252313 +0000 UTC m=+0.179116885 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 04:58:58 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 04:58:58 localhost nova_compute[282206]: 2026-02-23 09:58:58.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:58 localhost ovn_controller[157695]: 2026-02-23T09:58:58Z|00272|binding|INFO|Releasing lport 244c57bd-d22b-4733-8be3-0ce88383151e from this chassis (sb_readonly=0) Feb 23 04:58:58 localhost ovn_controller[157695]: 2026-02-23T09:58:58Z|00273|binding|INFO|Setting lport 244c57bd-d22b-4733-8be3-0ce88383151e down in Southbound Feb 23 04:58:58 localhost kernel: device tap244c57bd-d2 left promiscuous mode Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.155 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-169a9bd5-a623-4b9d-83a6-bf3f6708a358', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6aadd525d3dd402cb701922115d00291', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f70b6503-965d-4883-bf00-ea5f7a873818, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=244c57bd-d22b-4733-8be3-0ce88383151e) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.157 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 244c57bd-d22b-4733-8be3-0ce88383151e in datapath 169a9bd5-a623-4b9d-83a6-bf3f6708a358 unbound from our chassis#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.161 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 169a9bd5-a623-4b9d-83a6-bf3f6708a358, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.162 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[0be8fc78-4d4d-46df-9a6f-b9f6d1dd0afd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:58 localhost nova_compute[282206]: 2026-02-23 09:58:58.165 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:58 localhost nova_compute[282206]: 2026-02-23 09:58:58.167 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.207 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 
2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.209 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.213 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed 
_get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:58:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:58:58.214 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5a91fe0c-b703-4a1b-b234-6283153cd7d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:58:58 localhost nova_compute[282206]: 2026-02-23 09:58:58.277 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:58 localhost nova_compute[282206]: 2026-02-23 09:58:58.282 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:58:58 localhost neutron_sriov_agent[258207]: 2026-02-23 09:58:58.833 2 INFO neutron.agent.securitygroups_rpc [None req-d199110d-6878-4918-921e-08b8a8f76fc7 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:58:59 localhost dnsmasq[316758]: exiting on receipt of SIGTERM Feb 23 04:58:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:58:59 localhost systemd[1]: libpod-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576.scope: Deactivated successfully. 
Feb 23 04:58:59 localhost podman[316912]: 2026-02-23 09:58:59.351780273 +0000 UTC m=+0.421054249 container kill 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:58:59 localhost podman[316937]: 2026-02-23 09:58:59.4487772 +0000 UTC m=+0.066445494 container died 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:58:59 localhost podman[316924]: 2026-02-23 09:58:59.448679326 +0000 UTC m=+0.083191010 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:58:59 localhost systemd[1]: tmp-crun.PXW1yu.mount: Deactivated successfully. Feb 23 04:58:59 localhost podman[316937]: 2026-02-23 09:58:59.558679436 +0000 UTC m=+0.176347690 container remove 99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-169a9bd5-a623-4b9d-83a6-bf3f6708a358, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:58:59 localhost systemd[1]: libpod-conmon-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576.scope: Deactivated successfully. 
Feb 23 04:58:59 localhost podman[316924]: 2026-02-23 09:58:59.578632792 +0000 UTC m=+0.213144426 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container) Feb 23 04:58:59 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:58:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:59.683 265541 INFO neutron.agent.dhcp.agent [None req-75f5ce48-1363-4120-bad5-08d52bde641d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:58:59.684 265541 INFO neutron.agent.dhcp.agent [None req-75f5ce48-1363-4120-bad5-08d52bde641d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:58:59 localhost ovn_controller[157695]: 2026-02-23T09:58:59Z|00274|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:58:59 localhost systemd[1]: var-lib-containers-storage-overlay-74cfcad24671a33e4fe137c8b775f0c8c7b0e5f093f9846bc6f205eac1c88d45-merged.mount: Deactivated successfully. 
Feb 23 04:58:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99a319c90f4ac1870108199a405d4c36696c9613c78962ecaa9ad4ef6f1bd576-userdata-shm.mount: Deactivated successfully. Feb 23 04:58:59 localhost systemd[1]: run-netns-qdhcp\x2d169a9bd5\x2da623\x2d4b9d\x2d83a6\x2dbf3f6708a358.mount: Deactivated successfully. Feb 23 04:59:00 localhost nova_compute[282206]: 2026-02-23 09:59:00.077 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:00 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:00.147 2 INFO neutron.agent.securitygroups_rpc [None req-378c2f4f-1435-4dd5-9061-db29c2eb645b 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 04:59:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1474432843' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 04:59:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:59:01 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1692926290' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:59:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:59:01 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1692926290' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:59:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e142 do_prune osdmap full prune enabled Feb 23 04:59:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e143 e143: 6 total, 6 up, 6 in Feb 23 04:59:01 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in Feb 23 04:59:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:59:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s#012Cumulative WAL: 10K writes, 2868 syncs, 3.58 writes per sync, written: 0.04 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4956 writes, 16K keys, 4956 commit groups, 1.0 writes per commit group, ingest: 15.78 MB, 0.03 MB/s#012Interval WAL: 4955 writes, 2127 syncs, 2.33 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 04:59:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e143 do_prune osdmap full prune enabled Feb 23 04:59:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e144 e144: 6 total, 6 up, 6 in Feb 23 04:59:02 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in Feb 23 04:59:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:02.898 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, 
old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:02.900 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 
f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:59:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:02.904 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:02 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:02.905 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c648f7d3-d8aa-46c7-bac4-0de58bc6c14d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:03 localhost nova_compute[282206]: 2026-02-23 09:59:03.280 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e144 do_prune osdmap full prune enabled Feb 23 04:59:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e145 e145: 6 total, 6 up, 6 in Feb 23 04:59:03 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in Feb 23 04:59:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e145 do_prune osdmap full prune enabled Feb 23 04:59:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e146 e146: 6 total, 6 up, 6 in Feb 23 04:59:04 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in Feb 23 04:59:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:05.511 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], 
up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:05.513 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:59:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:05.516 163572 DEBUG neutron.agent.ovn.metadata.agent 
[-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:05 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:05.516 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1f29e7fe-73ae-4b1e-9cf3-2335d3a5dd7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 04:59:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 10K writes, 40K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s#012Cumulative WAL: 10K writes, 2814 syncs, 3.67 writes per sync, written: 0.03 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4763 writes, 16K keys, 4763 commit groups, 1.0 writes per commit group, ingest: 13.99 MB, 0.02 MB/s#012Interval WAL: 4763 writes, 2036 syncs, 2.34 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 04:59:06 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:06.660 2 INFO neutron.agent.securitygroups_rpc [None req-662ed715-f0df-4027-a61e-4a9b758296d7 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:07 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:07.921 2 INFO neutron.agent.securitygroups_rpc [None req-8305faf4-3c52-460c-b047-e282e2502f1e 982b83c89c37422a910f5359ef7b6ea5 
5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e146 do_prune osdmap full prune enabled Feb 23 04:59:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e147 e147: 6 total, 6 up, 6 in Feb 23 04:59:08 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in Feb 23 04:59:08 localhost nova_compute[282206]: 2026-02-23 09:59:08.284 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:08 localhost ovn_controller[157695]: 2026-02-23T09:59:08Z|00275|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:59:08 localhost nova_compute[282206]: 2026-02-23 09:59:08.902 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:09 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:09.077 2 INFO neutron.agent.securitygroups_rpc [None req-0b1bb1a0-ad29-4db2-a82b-e96c70807da2 9903926d083041b9a33881e7cab5b89f c0dc7447f79a422a9af7dbd04780afa6 - - default default] Security group member updated ['abb3c63b-8b38-4dd7-99e4-d8f07472a5d2']#033[00m Feb 23 04:59:09 localhost podman[242954]: time="2026-02-23T09:59:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:59:09 localhost podman[242954]: @ - - [23/Feb/2026:09:59:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160729 "" "Go-http-client/1.1" Feb 23 04:59:09 localhost podman[242954]: @ - - [23/Feb/2026:09:59:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19763 "" "Go-http-client/1.1" Feb 23 04:59:09 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:59:09 localhost systemd[1]: tmp-crun.DBMul9.mount: Deactivated successfully. Feb 23 04:59:09 localhost podman[316973]: 2026-02-23 09:59:09.921986465 +0000 UTC m=+0.087284627 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 04:59:09 localhost podman[316973]: 2026-02-23 09:59:09.961601879 
+0000 UTC m=+0.126900021 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 04:59:09 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:59:09 localhost podman[316972]: 2026-02-23 09:59:09.974743625 +0000 UTC m=+0.143549725 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true) Feb 23 04:59:10 localhost podman[316972]: 2026-02-23 09:59:10.015335809 +0000 UTC m=+0.184141899 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 23 04:59:10 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:59:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e147 do_prune osdmap full prune enabled Feb 23 04:59:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e148 e148: 6 total, 6 up, 6 in Feb 23 04:59:10 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in Feb 23 04:59:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:59:10 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1122435503' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:59:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:59:10 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1122435503' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.304 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 
'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.306 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.309 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.310 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c1ba46cd-3260-496a-b1a2-68d3894fe5d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:11 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:11.517 2 INFO neutron.agent.securitygroups_rpc [None req-3920dd90-663f-4a5e-863e-875f00aeb78d 9903926d083041b9a33881e7cab5b89f c0dc7447f79a422a9af7dbd04780afa6 - - default default] Security group member updated ['abb3c63b-8b38-4dd7-99e4-d8f07472a5d2']#033[00m Feb 23 04:59:11 localhost dnsmasq[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/addn_hosts - 0 addresses Feb 23 04:59:11 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/host Feb 
23 04:59:11 localhost dnsmasq-dhcp[316503]: read /var/lib/neutron/dhcp/fd854bec-4386-47ab-bc93-a08354b81ab6/opts Feb 23 04:59:11 localhost podman[317033]: 2026-02-23 09:59:11.687646466 +0000 UTC m=+0.067650921 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:11 localhost ovn_controller[157695]: 2026-02-23T09:59:11Z|00276|binding|INFO|Releasing lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 from this chassis (sb_readonly=0) Feb 23 04:59:11 localhost nova_compute[282206]: 2026-02-23 09:59:11.888 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:11 localhost kernel: device tap99eeaa57-51 left promiscuous mode Feb 23 04:59:11 localhost ovn_controller[157695]: 2026-02-23T09:59:11Z|00277|binding|INFO|Setting lport 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 down in Southbound Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.896 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 
'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd854bec-4386-47ab-bc93-a08354b81ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6a6c1ed33b7a401e921451e25668daed', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c8ac0cd8-62b1-44f3-b0a9-7a358af2ef4f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=99eeaa57-5103-4a27-8cb6-f740d5ffcf80) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.898 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 99eeaa57-5103-4a27-8cb6-f740d5ffcf80 in datapath fd854bec-4386-47ab-bc93-a08354b81ab6 unbound from our chassis#033[00m Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.900 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd854bec-4386-47ab-bc93-a08354b81ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:11 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:11.901 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5589ff04-f058-4543-8779-3ea97bfbdf0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:11 localhost nova_compute[282206]: 2026-02-23 09:59:11.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 
04:59:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e148 do_prune osdmap full prune enabled Feb 23 04:59:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e149 e149: 6 total, 6 up, 6 in Feb 23 04:59:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in Feb 23 04:59:13 localhost nova_compute[282206]: 2026-02-23 09:59:13.324 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:13 localhost openstack_network_exporter[245358]: ERROR 09:59:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:59:13 localhost openstack_network_exporter[245358]: Feb 23 04:59:13 localhost openstack_network_exporter[245358]: ERROR 09:59:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:59:13 localhost openstack_network_exporter[245358]: Feb 23 04:59:13 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:13.911 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:13 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:13.913 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:59:13 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:13.916 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:13 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:13.917 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[096eddeb-8fa9-4939-82f3-9cd7aa53b089]: (4, False) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e149 do_prune osdmap full prune enabled Feb 23 04:59:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e150 e150: 6 total, 6 up, 6 in Feb 23 04:59:14 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in Feb 23 04:59:14 localhost ovn_controller[157695]: 2026-02-23T09:59:14Z|00278|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:59:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:59:14 localhost nova_compute[282206]: 2026-02-23 09:59:14.900 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:59:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4123969531' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:59:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:59:14 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4123969531' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:59:14 localhost systemd[1]: tmp-crun.aiE9hR.mount: Deactivated successfully. 
Feb 23 04:59:14 localhost podman[317055]: 2026-02-23 09:59:14.956110478 +0000 UTC m=+0.128479031 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 04:59:14 localhost podman[317055]: 2026-02-23 09:59:14.994293298 +0000 UTC m=+0.166661801 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:59:15 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:59:15 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:15.024 2 INFO neutron.agent.securitygroups_rpc [None req-785e188e-0451-4720-a843-201f1ea322a4 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:15 localhost dnsmasq[316503]: exiting on receipt of SIGTERM Feb 23 04:59:15 localhost systemd[1]: libpod-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137.scope: Deactivated successfully. Feb 23 04:59:15 localhost podman[317089]: 2026-02-23 09:59:15.687995129 +0000 UTC m=+0.059136958 container kill 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 23 04:59:15 localhost podman[317102]: 2026-02-23 09:59:15.76018631 +0000 UTC m=+0.058969713 container died 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:59:15 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:15.776 2 INFO neutron.agent.securitygroups_rpc [None req-abe579f6-947b-47f0-ad4d-2e1c9d13dcd9 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:15 localhost podman[317102]: 2026-02-23 09:59:15.787792593 +0000 UTC m=+0.086575956 container cleanup 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:15 localhost systemd[1]: libpod-conmon-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137.scope: Deactivated successfully. 
Feb 23 04:59:15 localhost podman[317104]: 2026-02-23 09:59:15.830854983 +0000 UTC m=+0.122507925 container remove 09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd854bec-4386-47ab-bc93-a08354b81ab6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS) Feb 23 04:59:15 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:15.854 265541 INFO neutron.agent.dhcp.agent [None req-43cafa2d-f814-489f-ac45-1179456daee8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:15 localhost systemd[1]: var-lib-containers-storage-overlay-4b4d99b1f623479e08c1ba172833c2002e032a9da0c51ca920aa6a9e5f04cbc7-merged.mount: Deactivated successfully. Feb 23 04:59:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09dad08e9c72d7d1b99bbbc519a732dc2042ec07079abeaba60f983f4d203137-userdata-shm.mount: Deactivated successfully. Feb 23 04:59:15 localhost systemd[1]: run-netns-qdhcp\x2dfd854bec\x2d4386\x2d47ab\x2dbc93\x2da08354b81ab6.mount: Deactivated successfully. 
Feb 23 04:59:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:16.021 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:16 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:16.560 2 INFO neutron.agent.securitygroups_rpc [None req-b25a55ad-3164-4d9c-85c0-7cab33b9b16d f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group rule updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']#033[00m Feb 23 04:59:16 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:16.816 2 INFO neutron.agent.securitygroups_rpc [None req-e4424286-7161-4a51-a79b-4dabbb149f4e f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group rule updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']#033[00m Feb 23 04:59:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e150 do_prune osdmap full prune enabled Feb 23 04:59:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e151 e151: 6 total, 6 up, 6 in Feb 23 04:59:17 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in Feb 23 04:59:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 04:59:17 localhost podman[317133]: 2026-02-23 09:59:17.905556992 +0000 UTC m=+0.077989781 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 04:59:17 localhost 
podman[317133]: 2026-02-23 09:59:17.939292254 +0000 UTC m=+0.111725043 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:59:17 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:59:18 localhost nova_compute[282206]: 2026-02-23 09:59:18.328 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:19 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:19.271 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 10.100.0.2 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:19 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:19.273 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:59:19 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:19.277 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:19 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:19.278 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[3be2bed1-9ca7-4263-b058-83e3eec06cbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:20 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:20.246 2 INFO neutron.agent.securitygroups_rpc [None req-22dd3c2a-2988-4b13-8ce0-dbc57aa028bb 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:21 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:21.233 2 INFO neutron.agent.securitygroups_rpc [None req-f686a950-083b-4059-aa56-fe5b5d1b4f8c 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:22 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e151 do_prune osdmap full prune enabled Feb 23 04:59:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e152 e152: 6 total, 6 up, 6 in Feb 23 04:59:22 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in Feb 23 04:59:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 04:59:22 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 04:59:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:23.232 265541 INFO neutron.agent.linux.ip_lib [None req-3daf7321-3b3e-4175-8d98-c073a2c48a90 - - - - - -] Device tap7d2d4ed7-8f cannot be used as it has no MAC address#033[00m Feb 23 04:59:23 localhost neutron_sriov_agent[258207]: 
2026-02-23 09:59:23.258 2 INFO neutron.agent.securitygroups_rpc [req-d0cb4007-c7bf-4f23-9a12-bffb679ca45d req-9d22fdae-ec49-499d-83f5-abbd29e1424d f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group member updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.293 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost kernel: device tap7d2d4ed7-8f entered promiscuous mode Feb 23 04:59:23 localhost NetworkManager[5974]: [1771840763.3025] manager: (tap7d2d4ed7-8f): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.303 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost ovn_controller[157695]: 2026-02-23T09:59:23Z|00279|binding|INFO|Claiming lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 for this chassis. Feb 23 04:59:23 localhost ovn_controller[157695]: 2026-02-23T09:59:23Z|00280|binding|INFO|7d2d4ed7-8f1a-44e5-99e2-954f427618a6: Claiming unknown Feb 23 04:59:23 localhost systemd-udevd[317247]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:59:23 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:23.316 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18c7d4ac-b41a-428e-8eb5-19914e19db45, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7d2d4ed7-8f1a-44e5-99e2-954f427618a6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:23 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:23.318 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 in datapath 0c26ff8d-9894-4f75-a5d3-33bc934f6b99 bound to our chassis#033[00m Feb 23 04:59:23 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:23.321 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port df6f2a94-b7ca-4c07-965a-6e1d3125157c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 23 04:59:23 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:23.321 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:23 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:23.323 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[8369fffa-d8e7-4de6-b04e-ed0f13a1caa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:23.320 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:22Z, description=, device_id=eb34b095-bb71-40c8-bef1-74bba1c6b6f7, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=76b4fad5-6bb5-46d7-9cb7-fb6f22f09785, ip_allocation=immediate, mac_address=fa:16:3e:df:92:41, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:58:30Z, description=, dns_domain=, id=ff7aa220-5765-44c6-9121-cfbd718241c5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2004318845-network, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34500, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2127, status=ACTIVE, subnets=['8e5f5052-1626-4168-ae6d-3107f2c16e7a'], tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, 
updated_at=2026-02-23T09:58:32Z, vlan_transparent=None, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['c46df023-9a3e-4c54-a0bb-44b675220af4'], standard_attr_id=2426, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:59:22Z on network ff7aa220-5765-44c6-9121-cfbd718241c5#033[00m Feb 23 04:59:23 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 04:59:23 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.333 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.338 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost ovn_controller[157695]: 2026-02-23T09:59:23Z|00281|binding|INFO|Setting lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 ovn-installed in OVS Feb 23 04:59:23 localhost ovn_controller[157695]: 2026-02-23T09:59:23Z|00282|binding|INFO|Setting lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 up in Southbound Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.344 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on 
tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost journal[231253]: ethtool ioctl error on tap7d2d4ed7-8f: No such device Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.387 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:23.398 2 INFO neutron.agent.securitygroups_rpc [None req-669db256-0e1b-436d-bf03-d63867ff4f10 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.403 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.404 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.405 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.408 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 04:59:23 localhost nova_compute[282206]: 2026-02-23 09:59:23.423 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:59:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1208066577' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:59:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:59:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1208066577' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:59:23 localhost systemd[1]: tmp-crun.OenR2H.mount: Deactivated successfully. 
Feb 23 04:59:23 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 2 addresses Feb 23 04:59:23 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host Feb 23 04:59:23 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts Feb 23 04:59:23 localhost podman[317290]: 2026-02-23 09:59:23.556057211 +0000 UTC m=+0.066975625 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:23.914 265541 INFO neutron.agent.dhcp.agent [None req-fe33f44d-bb0a-4789-9e57-6d493b09e65c - - - - - -] DHCP configuration for ports {'76b4fad5-6bb5-46d7-9cb7-fb6f22f09785'} is completed#033[00m Feb 23 04:59:24 localhost podman[317354]: Feb 23 04:59:24 localhost podman[317354]: 2026-02-23 09:59:24.409554766 +0000 UTC m=+0.093099136 container create c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0) 
Feb 23 04:59:24 localhost systemd[1]: Started libpod-conmon-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e.scope. Feb 23 04:59:24 localhost systemd[1]: tmp-crun.qwq38d.mount: Deactivated successfully. Feb 23 04:59:24 localhost podman[317354]: 2026-02-23 09:59:24.365141091 +0000 UTC m=+0.048685451 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:59:24 localhost systemd[1]: Started libcrun container. Feb 23 04:59:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0aceec28e8080a4a37142df66778b1baec5d9c5e7130e92c6c16ab6cf85905ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:59:24 localhost podman[317354]: 2026-02-23 09:59:24.49692725 +0000 UTC m=+0.180471600 container init c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 04:59:24 localhost podman[317354]: 2026-02-23 09:59:24.506069327 +0000 UTC m=+0.189613657 container start c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, io.buildah.version=1.43.0) Feb 23 04:59:24 localhost dnsmasq[317373]: started, version 2.85 cachesize 150 Feb 23 04:59:24 localhost dnsmasq[317373]: DNS service limited to local subnets Feb 23 04:59:24 localhost dnsmasq[317373]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:59:24 localhost dnsmasq[317373]: warning: no upstream servers configured Feb 23 04:59:24 localhost dnsmasq-dhcp[317373]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:59:24 localhost dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 0 addresses Feb 23 04:59:24 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host Feb 23 04:59:24 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts Feb 23 04:59:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e152 do_prune osdmap full prune enabled Feb 23 04:59:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 e153: 6 total, 6 up, 6 in Feb 23 04:59:24 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.715 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": 
"192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.739 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.739 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.740 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.740 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.741 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:24 localhost nova_compute[282206]: 2026-02-23 09:59:24.741 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 23 04:59:24 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:24.816 265541 INFO neutron.agent.dhcp.agent [None req-f1954efa-b521-4467-86ff-026fae6d0d60 - - - - - -] DHCP configuration for ports {'e5d210d8-ac9c-41fd-8d79-4fc011a4e59f'} is completed#033[00m Feb 23 04:59:24 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:24.947 2 INFO neutron.agent.securitygroups_rpc [None req-f9eeb769-9fe9-4ea4-8c2c-b3370b8aaa5a 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:25 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:25.282 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005626466.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:22Z, description=, device_id=eb34b095-bb71-40c8-bef1-74bba1c6b6f7, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=tempest-volumesbackupstest-instance-2145282493, extra_dhcp_opts=[], fixed_ips=[], id=76b4fad5-6bb5-46d7-9cb7-fb6f22f09785, ip_allocation=immediate, 
mac_address=fa:16:3e:df:92:41, name=, network_id=ff7aa220-5765-44c6-9121-cfbd718241c5, port_security_enabled=True, project_id=0421515e6bb54dea8db3ed218999e195, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c46df023-9a3e-4c54-a0bb-44b675220af4'], standard_attr_id=2426, status=DOWN, tags=[], tenant_id=0421515e6bb54dea8db3ed218999e195, updated_at=2026-02-23T09:59:24Z on network ff7aa220-5765-44c6-9121-cfbd718241c5#033[00m Feb 23 04:59:25 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 2 addresses Feb 23 04:59:25 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host Feb 23 04:59:25 localhost podman[317391]: 2026-02-23 09:59:25.49196777 +0000 UTC m=+0.058583751 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 04:59:25 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts Feb 23 04:59:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 04:59:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:59:25 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:25.745 265541 INFO neutron.agent.dhcp.agent [None req-fdb320db-3739-482e-9ba7-1e06f4355dcd - - - - - -] DHCP configuration for 
ports {'76b4fad5-6bb5-46d7-9cb7-fb6f22f09785'} is completed#033[00m Feb 23 04:59:25 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:25.779 2 INFO neutron.agent.securitygroups_rpc [None req-451b3e02-3a26-4970-9980-9e73ed6341a9 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:26 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:26.110 2 INFO neutron.agent.securitygroups_rpc [None req-69ced21c-6b6a-47d6-87bf-2537c99ace20 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:26 localhost nova_compute[282206]: 2026-02-23 09:59:26.194 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:26 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:26.210 2 INFO neutron.agent.securitygroups_rpc [None req-12deaede-70c5-4b22-963c-9ca013b19b91 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 04:59:27 localhost nova_compute[282206]: 2026-02-23 09:59:27.096 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:27 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:27.419 2 INFO neutron.agent.securitygroups_rpc 
[None req-3c7a21a7-3db2-45cc-9718-507bcfbefba1 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:27 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:27.684 2 INFO neutron.agent.securitygroups_rpc [None req-d9a86df9-d280-468f-9907-620800eed5df 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.334 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.342 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.394 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:59:28 
localhost nova_compute[282206]: 2026-02-23 09:59:28.395 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.395 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.396 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.396 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:59:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:59:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/427022608' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.782 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.385s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:59:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.853 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:59:28 localhost nova_compute[282206]: 2026-02-23 09:59:28.853 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 04:59:28 localhost systemd[1]: tmp-crun.9T4ZDT.mount: Deactivated successfully. 
Feb 23 04:59:28 localhost podman[317434]: 2026-02-23 09:59:28.909345245 +0000 UTC m=+0.088728928 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:59:28 localhost podman[317434]: 2026-02-23 09:59:28.915584071 +0000 UTC m=+0.094967754 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 04:59:28 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.048 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.051 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11325MB free_disk=41.7744026184082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.051 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.052 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.348 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': 
{'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.348 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.349 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.411 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.470 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory 
/usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.471 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.494 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.518 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 04:59:29 localhost nova_compute[282206]: 2026-02-23 09:59:29.563 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 04:59:29 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:29.758 2 INFO neutron.agent.securitygroups_rpc [None 
req-79540abb-7fc2-40cd-b9ab-4b003306a8d3 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 04:59:29 localhost podman[317476]: 2026-02-23 09:59:29.910181408 +0000 UTC m=+0.088356096 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, managed_by=edpm_ansible, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 04:59:29 localhost podman[317476]: 2026-02-23 09:59:29.947344024 +0000 UTC m=+0.125518692 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, version=9.7) Feb 23 04:59:29 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 04:59:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 04:59:30 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1605527560' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 04:59:30 localhost nova_compute[282206]: 2026-02-23 09:59:30.068 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 04:59:30 localhost nova_compute[282206]: 2026-02-23 09:59:30.075 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 04:59:30 localhost nova_compute[282206]: 2026-02-23 09:59:30.090 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 
'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 04:59:30 localhost nova_compute[282206]: 2026-02-23 09:59:30.093 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 04:59:30 localhost nova_compute[282206]: 2026-02-23 09:59:30.093 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:59:31 localhost nova_compute[282206]: 2026-02-23 09:59:31.090 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:31 localhost nova_compute[282206]: 2026-02-23 09:59:31.091 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:31 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:31.264 2 INFO neutron.agent.securitygroups_rpc [None req-5bebbbf2-890e-4b6f-92fc-689753c8df12 446d22b7a9534ee1a139ffd6bef47f3c 
582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:31 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:31.363 2 INFO neutron.agent.securitygroups_rpc [None req-6d65bfbe-b42e-4b67-8b3f-02498c056686 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:32 localhost nova_compute[282206]: 2026-02-23 09:59:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e153 do_prune osdmap full prune enabled Feb 23 04:59:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e154 e154: 6 total, 6 up, 6 in Feb 23 04:59:32 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in Feb 23 04:59:32 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:32.242 2 INFO neutron.agent.securitygroups_rpc [None req-b9f58e08-6424-4ab4-86b8-227dc5c4bdfc 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:32 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:32.259 2 INFO neutron.agent.securitygroups_rpc [None req-9acdbf04-3277-47bc-8ea9-7f279ed4a9a4 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:32 localhost 
neutron_dhcp_agent[265537]: 2026-02-23 09:59:32.337 265541 INFO neutron.agent.linux.ip_lib [None req-ce05a959-3235-471f-bc6b-decd45cf34af - - - - - -] Device tapeff7547d-16 cannot be used as it has no MAC address#033[00m Feb 23 04:59:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:32.358 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:32.359 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 04:59:32 localhost nova_compute[282206]: 2026-02-23 09:59:32.418 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:32 localhost kernel: device tapeff7547d-16 entered promiscuous mode Feb 23 04:59:32 localhost ovn_controller[157695]: 2026-02-23T09:59:32Z|00283|binding|INFO|Claiming lport eff7547d-1684-4a00-829e-9369e5af1a4c for this chassis. 
Feb 23 04:59:32 localhost ovn_controller[157695]: 2026-02-23T09:59:32Z|00284|binding|INFO|eff7547d-1684-4a00-829e-9369e5af1a4c: Claiming unknown Feb 23 04:59:32 localhost nova_compute[282206]: 2026-02-23 09:59:32.426 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:32 localhost NetworkManager[5974]: [1771840772.4282] manager: (tapeff7547d-16): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Feb 23 04:59:32 localhost systemd-udevd[317508]: Network interface NamePolicy= disabled on kernel command line. Feb 23 04:59:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:32.437 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=891eac08-7b13-4a8b-bfb9-8821cd51b516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eff7547d-1684-4a00-829e-9369e5af1a4c) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:32.439 163572 INFO neutron.agent.ovn.metadata.agent [-] Port eff7547d-1684-4a00-829e-9369e5af1a4c in datapath d9ae15ed-aa9d-4bce-9192-334c9725a10c bound to our chassis#033[00m Feb 23 04:59:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:32.441 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d9ae15ed-aa9d-4bce-9192-334c9725a10c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:59:32 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:32.442 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5e81222e-caeb-4eb5-b178-435743de9a4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost ovn_controller[157695]: 2026-02-23T09:59:32Z|00285|binding|INFO|Setting lport eff7547d-1684-4a00-829e-9369e5af1a4c ovn-installed in OVS Feb 23 04:59:32 localhost ovn_controller[157695]: 2026-02-23T09:59:32Z|00286|binding|INFO|Setting lport eff7547d-1684-4a00-829e-9369e5af1a4c up in Southbound Feb 23 04:59:32 localhost nova_compute[282206]: 2026-02-23 09:59:32.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 
04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost journal[231253]: ethtool ioctl error on tapeff7547d-16: No such device Feb 23 04:59:32 localhost nova_compute[282206]: 2026-02-23 09:59:32.507 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:32 localhost nova_compute[282206]: 2026-02-23 09:59:32.536 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:33 localhost nova_compute[282206]: 2026-02-23 09:59:33.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:33 localhost nova_compute[282206]: 2026-02-23 09:59:33.118 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:33 localhost nova_compute[282206]: 2026-02-23 09:59:33.336 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:33 localhost nova_compute[282206]: 2026-02-23 09:59:33.347 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:33 localhost podman[317578]: Feb 23 04:59:33 localhost podman[317578]: 2026-02-23 09:59:33.429837607 +0000 UTC m=+0.102735718 container create 
1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216) Feb 23 04:59:33 localhost podman[317578]: 2026-02-23 09:59:33.379509156 +0000 UTC m=+0.052407317 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:59:33 localhost systemd[1]: Started libpod-conmon-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242.scope. Feb 23 04:59:33 localhost systemd[1]: Started libcrun container. Feb 23 04:59:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/182134e9323da2a38d8aba6d82a6ee7725f82a27095416b853817f55614cd542/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:59:33 localhost podman[317578]: 2026-02-23 09:59:33.512726239 +0000 UTC m=+0.185624350 container init 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216) Feb 23 04:59:33 localhost podman[317578]: 2026-02-23 09:59:33.522352012 +0000 UTC m=+0.195250123 container start 
1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0) Feb 23 04:59:33 localhost dnsmasq[317597]: started, version 2.85 cachesize 150 Feb 23 04:59:33 localhost dnsmasq[317597]: DNS service limited to local subnets Feb 23 04:59:33 localhost dnsmasq[317597]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:59:33 localhost dnsmasq[317597]: warning: no upstream servers configured Feb 23 04:59:33 localhost dnsmasq-dhcp[317597]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 04:59:33 localhost dnsmasq[317597]: read /var/lib/neutron/dhcp/d9ae15ed-aa9d-4bce-9192-334c9725a10c/addn_hosts - 0 addresses Feb 23 04:59:33 localhost dnsmasq-dhcp[317597]: read /var/lib/neutron/dhcp/d9ae15ed-aa9d-4bce-9192-334c9725a10c/host Feb 23 04:59:33 localhost dnsmasq-dhcp[317597]: read /var/lib/neutron/dhcp/d9ae15ed-aa9d-4bce-9192-334c9725a10c/opts Feb 23 04:59:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:33.656 265541 INFO neutron.agent.dhcp.agent [None req-e5840c49-f204-4b66-a790-f8a94ecb8e75 - - - - - -] DHCP configuration for ports {'c33a8bfd-5ed5-47c7-86ae-d437d28631db'} is completed#033[00m Feb 23 04:59:33 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:33.785 2 INFO neutron.agent.securitygroups_rpc [None req-fd6425dd-2055-4cbb-a840-beb509d9498c 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group 
member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:33 localhost nova_compute[282206]: 2026-02-23 09:59:33.918 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e154 do_prune osdmap full prune enabled Feb 23 04:59:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e155 e155: 6 total, 6 up, 6 in Feb 23 04:59:34 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in Feb 23 04:59:35 localhost nova_compute[282206]: 2026-02-23 09:59:35.076 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:35 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:35.953 2 INFO neutron.agent.securitygroups_rpc [None req-98b565c4-bd61-4452-bccd-bd5e5d274484 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:36 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:36.038 2 INFO neutron.agent.securitygroups_rpc [None req-1e0c6ce2-0fa1-4c95-beb4-0c824ad3b485 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e155 do_prune osdmap full prune enabled Feb 23 04:59:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e156 e156: 6 total, 6 up, 6 in Feb 23 04:59:36 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in Feb 23 04:59:36 localhost neutron_sriov_agent[258207]: 2026-02-23 
09:59:36.627 2 INFO neutron.agent.securitygroups_rpc [None req-8ded7049-1595-4d50-af9e-057a8bcc7a90 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:36 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:36.810 2 INFO neutron.agent.securitygroups_rpc [None req-0d2af92f-c1c3-481d-a685-438d421cfdf3 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.121083) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777121148, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2135, "num_deletes": 263, "total_data_size": 2136994, "memory_usage": 2193680, "flush_reason": "Manual Compaction"} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777134616, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, 
"file_size": 2069328, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27575, "largest_seqno": 29709, "table_properties": {"data_size": 2060472, "index_size": 5429, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19826, "raw_average_key_size": 20, "raw_value_size": 2041984, "raw_average_value_size": 2158, "num_data_blocks": 236, "num_entries": 946, "num_filter_entries": 946, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840639, "oldest_key_time": 1771840639, "file_creation_time": 1771840777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 13575 microseconds, and 5564 cpu microseconds. Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.134666) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2069328 bytes OK Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.134687) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137076) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137096) EVENT_LOG_v1 {"time_micros": 1771840777137090, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137120) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2127774, prev total WAL file size 2128379, number of live WAL files 2. Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137924) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303139' seq:72057594037927935, type:22 .. 
'6C6F676D0034323731' seq:0, type:0; will stop at (end) Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2020KB)], [48(16MB)] Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777137984, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 18874849, "oldest_snapshot_seqno": -1} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 12994 keys, 18375915 bytes, temperature: kUnknown Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777241501, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 18375915, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18298438, "index_size": 43904, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32517, "raw_key_size": 347070, "raw_average_key_size": 26, "raw_value_size": 18073886, "raw_average_value_size": 1390, "num_data_blocks": 1680, "num_entries": 12994, "num_filter_entries": 12994, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.241826) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 18375915 bytes Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.243971) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.2 rd, 177.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 16.0 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(18.0) write-amplify(8.9) OK, records in: 13535, records dropped: 541 output_compression: NoCompression Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.243998) EVENT_LOG_v1 {"time_micros": 1771840777243986, "job": 28, "event": "compaction_finished", "compaction_time_micros": 103592, "compaction_time_cpu_micros": 53826, "output_level": 6, "num_output_files": 1, "total_output_size": 18375915, "num_input_records": 13535, "num_output_records": 12994, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777244801, "job": 28, "event": "table_file_deletion", "file_number": 50} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840777248215, "job": 28, "event": "table_file_deletion", "file_number": 48} Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.137781) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248287) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248296) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248299) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248302) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:37 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:37.248307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e156 do_prune osdmap full prune enabled Feb 23 04:59:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e157 e157: 6 
total, 6 up, 6 in Feb 23 04:59:37 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in Feb 23 04:59:37 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:37.957 2 INFO neutron.agent.securitygroups_rpc [None req-4c34a092-1bee-4730-b083-a159f9af8bdd 446d22b7a9534ee1a139ffd6bef47f3c 582130ae966043d38e47148509dbe266 - - default default] Security group member updated ['ee3954e0-cd09-4323-ae3c-c3f1e63159bd']#033[00m Feb 23 04:59:38 localhost sshd[317598]: main: sshd: ssh-rsa algorithm is disabled Feb 23 04:59:38 localhost nova_compute[282206]: 2026-02-23 09:59:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 04:59:38 localhost nova_compute[282206]: 2026-02-23 09:59:38.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Feb 23 04:59:38 localhost nova_compute[282206]: 2026-02-23 09:59:38.079 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Feb 23 04:59:38 localhost nova_compute[282206]: 2026-02-23 09:59:38.368 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:38 localhost nova_compute[282206]: 2026-02-23 09:59:38.370 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e157 do_prune osdmap full prune enabled Feb 23 04:59:38 
localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e158 e158: 6 total, 6 up, 6 in Feb 23 04:59:38 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in Feb 23 04:59:38 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:38.598 2 INFO neutron.agent.securitygroups_rpc [None req-32c78c98-6e6f-4ee3-a8c4-559f4243f6cc 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:38 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:38.977 2 INFO neutron.agent.securitygroups_rpc [None req-6b31b9c1-f7cb-4728-a77a-3dbb7699a58d 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:39 localhost podman[242954]: time="2026-02-23T09:59:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 04:59:39 localhost podman[242954]: @ - - [23/Feb/2026:09:59:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162552 "" "Go-http-client/1.1" Feb 23 04:59:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e158 do_prune osdmap full prune enabled Feb 23 04:59:39 localhost podman[242954]: @ - - [23/Feb/2026:09:59:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20248 "" "Go-http-client/1.1" Feb 23 04:59:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e159 e159: 6 total, 6 up, 6 in Feb 23 04:59:39 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in Feb 23 04:59:40 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:40.362 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, 
record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 04:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 04:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 04:59:40 localhost systemd[1]: tmp-crun.HSRSWQ.mount: Deactivated successfully. Feb 23 04:59:40 localhost podman[317601]: 2026-02-23 09:59:40.97438535 +0000 UTC m=+0.140841645 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 04:59:40 localhost podman[317601]: 2026-02-23 09:59:40.982376222 +0000 UTC m=+0.148832537 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 04:59:40 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 04:59:41 localhost podman[317600]: 2026-02-23 09:59:40.938181683 +0000 UTC m=+0.109043655 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:59:41 localhost podman[317600]: 2026-02-23 09:59:41.06800895 +0000 UTC m=+0.238870992 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216) Feb 23 04:59:41 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 04:59:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e159 do_prune osdmap full prune enabled Feb 23 04:59:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e160 e160: 6 total, 6 up, 6 in Feb 23 04:59:41 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in Feb 23 04:59:41 localhost systemd[1]: tmp-crun.5SI1FK.mount: Deactivated successfully. 
Feb 23 04:59:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:42 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:42.410 2 INFO neutron.agent.securitygroups_rpc [None req-ea1ece96-8ed7-4628-9eb3-8fcb879af238 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:43 localhost nova_compute[282206]: 2026-02-23 09:59:43.371 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:43 localhost openstack_network_exporter[245358]: ERROR 09:59:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 04:59:43 localhost openstack_network_exporter[245358]: Feb 23 04:59:43 localhost openstack_network_exporter[245358]: ERROR 09:59:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 04:59:43 localhost openstack_network_exporter[245358]: Feb 23 04:59:43 localhost nova_compute[282206]: 2026-02-23 09:59:43.380 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:43.391 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:42Z, description=, device_id=e923d3f9-631b-4cb5-b450-aef4f64e2d2c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b75036d4-c2df-48cf-944c-8a783718cf0a, ip_allocation=immediate, 
mac_address=fa:16:3e:53:1e:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:59:20Z, description=, dns_domain=, id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-1148185001, port_security_enabled=True, project_id=20ee87502ddb4dde82419e1f4302f590, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48707, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2410, status=ACTIVE, subnets=['4203c185-aad1-4b87-8334-57a63d4b2075'], tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:21Z, vlan_transparent=None, network_id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, port_security_enabled=False, project_id=20ee87502ddb4dde82419e1f4302f590, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2498, status=DOWN, tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:42Z on network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99#033[00m Feb 23 04:59:43 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:43.477 2 INFO neutron.agent.securitygroups_rpc [None req-cd7d52bd-6d88-4de8-a801-ff1f615f2e4b 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:43 localhost podman[317662]: 2026-02-23 09:59:43.655337368 +0000 UTC m=+0.047533994 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 04:59:43 localhost dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 1 addresses Feb 23 04:59:43 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host Feb 23 04:59:43 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts Feb 23 04:59:43 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:43.842 265541 INFO neutron.agent.dhcp.agent [None req-1578ebdc-a17c-47a5-9fd6-8d19d4f96456 - - - - - -] DHCP configuration for ports {'b75036d4-c2df-48cf-944c-8a783718cf0a'} is completed#033[00m Feb 23 04:59:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e160 do_prune osdmap full prune enabled Feb 23 04:59:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e161 e161: 6 total, 6 up, 6 in Feb 23 04:59:43 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in Feb 23 04:59:44 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:44.576 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T09:59:42Z, description=, device_id=e923d3f9-631b-4cb5-b450-aef4f64e2d2c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b75036d4-c2df-48cf-944c-8a783718cf0a, ip_allocation=immediate, mac_address=fa:16:3e:53:1e:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T09:59:20Z, description=, dns_domain=, 
id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-1148185001, port_security_enabled=True, project_id=20ee87502ddb4dde82419e1f4302f590, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48707, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2410, status=ACTIVE, subnets=['4203c185-aad1-4b87-8334-57a63d4b2075'], tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:21Z, vlan_transparent=None, network_id=0c26ff8d-9894-4f75-a5d3-33bc934f6b99, port_security_enabled=False, project_id=20ee87502ddb4dde82419e1f4302f590, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2498, status=DOWN, tags=[], tenant_id=20ee87502ddb4dde82419e1f4302f590, updated_at=2026-02-23T09:59:42Z on network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99#033[00m Feb 23 04:59:44 localhost dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 1 addresses Feb 23 04:59:44 localhost podman[317700]: 2026-02-23 09:59:44.809474614 +0000 UTC m=+0.061482131 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:59:44 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host Feb 23 04:59:44 localhost dnsmasq-dhcp[317373]: read 
/var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts Feb 23 04:59:45 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:45.162 265541 INFO neutron.agent.dhcp.agent [None req-f9af8222-d352-4117-a3c1-39f02d72ab62 - - - - - -] DHCP configuration for ports {'b75036d4-c2df-48cf-944c-8a783718cf0a'} is completed#033[00m Feb 23 04:59:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 04:59:45 localhost podman[317720]: 2026-02-23 09:59:45.925958388 +0000 UTC m=+0.092560608 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute) Feb 23 04:59:45 localhost podman[317720]: 2026-02-23 09:59:45.966369288 +0000 UTC m=+0.132971548 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:45 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 04:59:46 localhost dnsmasq[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/addn_hosts - 0 addresses Feb 23 04:59:46 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/host Feb 23 04:59:46 localhost dnsmasq-dhcp[317373]: read /var/lib/neutron/dhcp/0c26ff8d-9894-4f75-a5d3-33bc934f6b99/opts Feb 23 04:59:46 localhost podman[317754]: 2026-02-23 09:59:46.561977073 +0000 UTC m=+0.058353054 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e161 do_prune osdmap full prune enabled Feb 23 04:59:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e162 e162: 6 total, 6 up, 6 in Feb 23 04:59:47 localhost ceph-mon[294160]: 
log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in Feb 23 04:59:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e162 do_prune osdmap full prune enabled Feb 23 04:59:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e163 e163: 6 total, 6 up, 6 in Feb 23 04:59:47 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in Feb 23 04:59:47 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:47.282 265541 INFO neutron.agent.linux.ip_lib [None req-b474b464-2d05-4bfb-a71a-3e1191b45e29 - - - - - -] Device tapcf62e779-21 cannot be used as it has no MAC address#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.343 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost kernel: device tapcf62e779-21 entered promiscuous mode Feb 23 04:59:47 localhost NetworkManager[5974]: [1771840787.3507] manager: (tapcf62e779-21): new Generic device (/org/freedesktop/NetworkManager/Devices/48) Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.350 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost ovn_controller[157695]: 2026-02-23T09:59:47Z|00287|binding|INFO|Claiming lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 for this chassis. Feb 23 04:59:47 localhost ovn_controller[157695]: 2026-02-23T09:59:47Z|00288|binding|INFO|cf62e779-21c9-44d1-992a-8d67e75ee9a4: Claiming unknown Feb 23 04:59:47 localhost systemd-udevd[317784]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.360 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.243/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5e40f037-bddd-4e41-9358-072288273862', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e40f037-bddd-4e41-9358-072288273862', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd636fd1-c9ff-44d0-b6d5-3a4c5f8e69de, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cf62e779-21c9-44d1-992a-8d67e75ee9a4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.362 163572 INFO neutron.agent.ovn.metadata.agent [-] Port cf62e779-21c9-44d1-992a-8d67e75ee9a4 in datapath 5e40f037-bddd-4e41-9358-072288273862 bound to our chassis#033[00m Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.364 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port 034ef5b0-24bc-4eeb-b7eb-fa73747ebcf1 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.364 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e40f037-bddd-4e41-9358-072288273862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.365 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ad03834a-1390-48cb-8b71-f95cfc79a5c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost ovn_controller[157695]: 2026-02-23T09:59:47Z|00289|binding|INFO|Setting lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 ovn-installed in OVS Feb 23 04:59:47 localhost ovn_controller[157695]: 2026-02-23T09:59:47Z|00290|binding|INFO|Setting lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 up in Southbound Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.404 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.416 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost kernel: device tap7d2d4ed7-8f left promiscuous mode Feb 23 04:59:47 localhost ovn_controller[157695]: 2026-02-23T09:59:47Z|00291|binding|INFO|Releasing 
lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 from this chassis (sb_readonly=0) Feb 23 04:59:47 localhost ovn_controller[157695]: 2026-02-23T09:59:47Z|00292|binding|INFO|Setting lport 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 down in Southbound Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.432 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c26ff8d-9894-4f75-a5d3-33bc934f6b99', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18c7d4ac-b41a-428e-8eb5-19914e19db45, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7d2d4ed7-8f1a-44e5-99e2-954f427618a6) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.433 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 7d2d4ed7-8f1a-44e5-99e2-954f427618a6 in datapath 0c26ff8d-9894-4f75-a5d3-33bc934f6b99 unbound from our chassis#033[00m Feb 23 04:59:47 localhost journal[231253]: ethtool ioctl error on tapcf62e779-21: No such device Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.435 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c26ff8d-9894-4f75-a5d3-33bc934f6b99, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:47 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:47.436 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[be8de939-bd72-48c9-a60a-34a3273bff0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.464 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.468 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.818 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 
04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.846 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Triggering sync for uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.847 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.847 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:59:47 localhost nova_compute[282206]: 2026-02-23 09:59:47.876 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:59:48 localhost podman[317856]: Feb 23 04:59:48 localhost podman[317856]: 2026-02-23 09:59:48.345090174 +0000 UTC m=+0.099851548 container create 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Feb 23 04:59:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 04:59:48 localhost nova_compute[282206]: 2026-02-23 09:59:48.374 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:48 localhost nova_compute[282206]: 2026-02-23 09:59:48.380 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:48 localhost systemd[1]: Started libpod-conmon-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a.scope. Feb 23 04:59:48 localhost podman[317856]: 2026-02-23 09:59:48.296031802 +0000 UTC m=+0.050793226 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 04:59:48 localhost systemd[1]: tmp-crun.lOjpOE.mount: Deactivated successfully. Feb 23 04:59:48 localhost systemd[1]: Started libcrun container. 
Feb 23 04:59:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db56fcd2726acc1983ef9844d0380fb1390c65eda950b0c88ea7f886d5e3e2ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 04:59:48 localhost podman[317856]: 2026-02-23 09:59:48.441807041 +0000 UTC m=+0.196568425 container init 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 04:59:48 localhost dnsmasq[317885]: started, version 2.85 cachesize 150 Feb 23 04:59:48 localhost dnsmasq[317885]: DNS service limited to local subnets Feb 23 04:59:48 localhost dnsmasq[317885]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 04:59:48 localhost dnsmasq[317885]: warning: no upstream servers configured Feb 23 04:59:48 localhost dnsmasq-dhcp[317885]: DHCP, static leases only on 10.100.255.240, lease time 1d Feb 23 04:59:48 localhost dnsmasq[317885]: read /var/lib/neutron/dhcp/5e40f037-bddd-4e41-9358-072288273862/addn_hosts - 0 addresses Feb 23 04:59:48 localhost dnsmasq-dhcp[317885]: read /var/lib/neutron/dhcp/5e40f037-bddd-4e41-9358-072288273862/host Feb 23 04:59:48 localhost dnsmasq-dhcp[317885]: read /var/lib/neutron/dhcp/5e40f037-bddd-4e41-9358-072288273862/opts Feb 23 04:59:48 localhost podman[317869]: 2026-02-23 09:59:48.486726841 +0000 UTC m=+0.102472208 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 04:59:48 localhost podman[317856]: 2026-02-23 09:59:48.502537868 +0000 UTC m=+0.257299242 container start 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:59:48 localhost podman[317869]: 2026-02-23 09:59:48.520231863 +0000 UTC m=+0.135977250 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Feb 23 04:59:48 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 04:59:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:48.559 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 04:59:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:48.560 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 04:59:48 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 04:59:48 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:48.722 2 INFO neutron.agent.securitygroups_rpc [None req-48538392-c56b-4b21-9b40-cb72b13d2341 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated 
['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:48 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:48.776 265541 INFO neutron.agent.dhcp.agent [None req-c84393dc-9795-449b-be53-2b977d3475e7 - - - - - -] DHCP configuration for ports {'580598fc-fc38-454c-a9c1-155d41252ed8'} is completed#033[00m Feb 23 04:59:49 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:49.352 2 INFO neutron.agent.securitygroups_rpc [None req-fe77e4f6-f29f-4a3f-9227-290ccdc2ae2e 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:49 localhost dnsmasq[317597]: exiting on receipt of SIGTERM Feb 23 04:59:49 localhost podman[317911]: 2026-02-23 09:59:49.67184569 +0000 UTC m=+0.066237990 container kill 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0) Feb 23 04:59:49 localhost systemd[1]: tmp-crun.vGg7jO.mount: Deactivated successfully. Feb 23 04:59:49 localhost systemd[1]: libpod-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242.scope: Deactivated successfully. 
Feb 23 04:59:49 localhost podman[317926]: 2026-02-23 09:59:49.749499139 +0000 UTC m=+0.063200545 container died 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 04:59:49 localhost podman[317926]: 2026-02-23 09:59:49.784685655 +0000 UTC m=+0.098387021 container cleanup 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 04:59:49 localhost systemd[1]: libpod-conmon-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242.scope: Deactivated successfully. 
Feb 23 04:59:49 localhost podman[317928]: 2026-02-23 09:59:49.8374079 +0000 UTC m=+0.141294808 container remove 1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d9ae15ed-aa9d-4bce-9192-334c9725a10c, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 04:59:49 localhost ovn_controller[157695]: 2026-02-23T09:59:49Z|00293|binding|INFO|Releasing lport eff7547d-1684-4a00-829e-9369e5af1a4c from this chassis (sb_readonly=0) Feb 23 04:59:49 localhost kernel: device tapeff7547d-16 left promiscuous mode Feb 23 04:59:49 localhost ovn_controller[157695]: 2026-02-23T09:59:49Z|00294|binding|INFO|Setting lport eff7547d-1684-4a00-829e-9369e5af1a4c down in Southbound Feb 23 04:59:49 localhost nova_compute[282206]: 2026-02-23 09:59:49.901 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:49.919 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-d9ae15ed-aa9d-4bce-9192-334c9725a10c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20ee87502ddb4dde82419e1f4302f590', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=891eac08-7b13-4a8b-bfb9-8821cd51b516, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=eff7547d-1684-4a00-829e-9369e5af1a4c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:49.921 163572 INFO neutron.agent.ovn.metadata.agent [-] Port eff7547d-1684-4a00-829e-9369e5af1a4c in datapath d9ae15ed-aa9d-4bce-9192-334c9725a10c unbound from our chassis#033[00m Feb 23 04:59:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:49.924 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9ae15ed-aa9d-4bce-9192-334c9725a10c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:49 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:49.925 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[827accd0-cf3d-41c3-b33b-9c80310624ff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:49 localhost nova_compute[282206]: 2026-02-23 09:59:49.931 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:50.153 265541 INFO neutron.agent.dhcp.agent [None 
req-c10188fe-5a98-4e71-ab93-ea285fb52e20 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:50.154 265541 INFO neutron.agent.dhcp.agent [None req-c10188fe-5a98-4e71-ab93-ea285fb52e20 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:50 localhost systemd[1]: var-lib-containers-storage-overlay-182134e9323da2a38d8aba6d82a6ee7725f82a27095416b853817f55614cd542-merged.mount: Deactivated successfully. Feb 23 04:59:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f2b0c282578fa14ce7f3234a36f89894c3bfd2be0e1124588d44437a5f72242-userdata-shm.mount: Deactivated successfully. Feb 23 04:59:50 localhost systemd[1]: run-netns-qdhcp\x2dd9ae15ed\x2daa9d\x2d4bce\x2d9192\x2d334c9725a10c.mount: Deactivated successfully. Feb 23 04:59:50 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:50.548 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:50 localhost ovn_controller[157695]: 2026-02-23T09:59:50Z|00295|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:59:50 localhost nova_compute[282206]: 2026-02-23 09:59:50.789 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:51 localhost dnsmasq[317373]: exiting on receipt of SIGTERM Feb 23 04:59:51 localhost podman[317973]: 2026-02-23 09:59:51.826364706 +0000 UTC m=+0.059537711 container kill c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:51 localhost systemd[1]: libpod-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e.scope: Deactivated successfully. Feb 23 04:59:51 localhost podman[317987]: 2026-02-23 09:59:51.898346807 +0000 UTC m=+0.058585671 container died c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 04:59:51 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e-userdata-shm.mount: Deactivated successfully. 
Feb 23 04:59:51 localhost podman[317987]: 2026-02-23 09:59:51.931719545 +0000 UTC m=+0.091958369 container cleanup c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 23 04:59:51 localhost systemd[1]: libpod-conmon-c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e.scope: Deactivated successfully. Feb 23 04:59:51 localhost podman[317989]: 2026-02-23 09:59:51.985309788 +0000 UTC m=+0.136993734 container remove c234c6af3cf9946dc051be1dcad89c50800ef76a490b597d3a6cad540a59a53e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0c26ff8d-9894-4f75-a5d3-33bc934f6b99, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0) Feb 23 04:59:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:52.015 265541 INFO neutron.agent.dhcp.agent [None req-72146101-3418-4c72-af6e-09010364e98d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e163 do_prune 
osdmap full prune enabled Feb 23 04:59:52 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:52.218 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 04:59:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e164 e164: 6 total, 6 up, 6 in Feb 23 04:59:52 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.245447) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792245487, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 523, "num_deletes": 255, "total_data_size": 334964, "memory_usage": 344728, "flush_reason": "Manual Compaction"} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792251181, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 328652, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29710, "largest_seqno": 30232, "table_properties": {"data_size": 325799, "index_size": 836, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7459, 
"raw_average_key_size": 20, "raw_value_size": 319910, "raw_average_value_size": 878, "num_data_blocks": 36, "num_entries": 364, "num_filter_entries": 364, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840777, "oldest_key_time": 1771840777, "file_creation_time": 1771840792, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 5775 microseconds, and 1799 cpu microseconds. Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.251223) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 328652 bytes OK Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.251245) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.256409) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.256431) EVENT_LOG_v1 {"time_micros": 1771840792256424, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.256453) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 331876, prev total WAL file size 331876, number of live WAL files 2. Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.257135) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. 
'7061786F73003132333030' seq:0, type:0; will stop at (end) Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(320KB)], [51(17MB)] Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792257183, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 18704567, "oldest_snapshot_seqno": -1} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12833 keys, 17422861 bytes, temperature: kUnknown Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792380444, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 17422861, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17347442, "index_size": 42230, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32133, "raw_key_size": 344271, "raw_average_key_size": 26, "raw_value_size": 17126615, "raw_average_value_size": 1334, "num_data_blocks": 1604, "num_entries": 12833, "num_filter_entries": 12833, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840792, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.381197) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 17422861 bytes Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.385346) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.7 rd, 141.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 17.5 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(109.9) write-amplify(53.0) OK, records in: 13358, records dropped: 525 output_compression: NoCompression Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.385377) EVENT_LOG_v1 {"time_micros": 1771840792385364, "job": 30, "event": "compaction_finished", "compaction_time_micros": 123324, "compaction_time_cpu_micros": 49010, "output_level": 6, "num_output_files": 1, "total_output_size": 17422861, "num_input_records": 13358, "num_output_records": 12833, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792385609, "job": 30, "event": "table_file_deletion", "file_number": 53} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840792388831, "job": 30, "event": "table_file_deletion", "file_number": 51} Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.257047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388926) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:52 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:52.388938) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:52 localhost systemd[1]: var-lib-containers-storage-overlay-0aceec28e8080a4a37142df66778b1baec5d9c5e7130e92c6c16ab6cf85905ce-merged.mount: Deactivated successfully. 
Feb 23 04:59:52 localhost systemd[1]: run-netns-qdhcp\x2d0c26ff8d\x2d9894\x2d4f75\x2da5d3\x2d33bc934f6b99.mount: Deactivated successfully. Feb 23 04:59:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:53.175 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8:0:1:f816:3eff:fe4b:79d0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f6a957-d664-4eea-abeb-3145dfe24d16, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f9f417b1-97ff-4889-a004-ebc80afbc25a) old=Port_Binding(mac=['fa:16:3e:4b:79:d0 2001:db8::f816:3eff:fe4b:79d0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe4b:79d0/64', 'neutron:device_id': 'ovnmeta-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73ae202-1b92-44f9-b55c-a6eba0a348b0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5bed3d4ae9fd4fe7b9440b4587246c14', 'neutron:revision_number': '28', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:53.178 163572 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f9f417b1-97ff-4889-a004-ebc80afbc25a in datapath c73ae202-1b92-44f9-b55c-a6eba0a348b0 updated#033[00m Feb 23 04:59:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:53.182 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73ae202-1b92-44f9-b55c-a6eba0a348b0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 04:59:53 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:53.183 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b705a777-2894-4fef-9fa7-aebd73e52ebe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:53 localhost nova_compute[282206]: 2026-02-23 09:59:53.415 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:53 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:53.710 2 INFO neutron.agent.securitygroups_rpc [None req-e07daaf7-bd51-4e1d-9776-48aaf7c6d99d 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:54 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:54.178 2 INFO neutron.agent.securitygroups_rpc [None req-0c26025a-3ebd-4293-969f-53f7049344a7 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 04:59:55 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader).osd e164 do_prune osdmap full prune enabled Feb 23 04:59:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e165 e165: 6 total, 6 up, 6 in Feb 23 04:59:55 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in Feb 23 04:59:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e165 do_prune osdmap full prune enabled Feb 23 04:59:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e166 e166: 6 total, 6 up, 6 in Feb 23 04:59:56 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in Feb 23 04:59:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 04:59:57 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1368292100' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 04:59:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 04:59:57 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1368292100' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 04:59:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0. 
Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.179132) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797179224, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 326, "num_deletes": 250, "total_data_size": 97591, "memory_usage": 103056, "flush_reason": "Manual Compaction"} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797182192, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 96107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30233, "largest_seqno": 30558, "table_properties": {"data_size": 94000, "index_size": 282, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6006, "raw_average_key_size": 20, "raw_value_size": 89689, "raw_average_value_size": 307, "num_data_blocks": 12, "num_entries": 292, "num_filter_entries": 292, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840792, "oldest_key_time": 1771840792, "file_creation_time": 1771840797, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 3098 microseconds, and 1198 cpu microseconds. Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.182241) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 96107 bytes OK Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.182262) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.184126) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.184147) EVENT_LOG_v1 {"time_micros": 1771840797184141, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.184169) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 95311, prev total WAL file size 95635, number of 
live WAL files 2. Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.187026) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303033' seq:72057594037927935, type:22 .. '6D6772737461740034323534' seq:0, type:0; will stop at (end) Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(93KB)], [54(16MB)] Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797187071, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 17518968, "oldest_snapshot_seqno": -1} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 12613 keys, 15414183 bytes, temperature: kUnknown Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797287572, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 15414183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15345078, "index_size": 36492, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31557, "raw_key_size": 339949, "raw_average_key_size": 26, "raw_value_size": 15132880, 
"raw_average_value_size": 1199, "num_data_blocks": 1365, "num_entries": 12613, "num_filter_entries": 12613, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840797, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.287963) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 15414183 bytes Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.289823) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.2 rd, 153.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 16.6 +0.0 blob) out(14.7 +0.0 blob), read-write-amplify(342.7) write-amplify(160.4) OK, records in: 13125, records dropped: 512 output_compression: NoCompression Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.289863) EVENT_LOG_v1 {"time_micros": 1771840797289849, "job": 32, "event": "compaction_finished", "compaction_time_micros": 100590, "compaction_time_cpu_micros": 46750, "output_level": 6, "num_output_files": 1, "total_output_size": 15414183, "num_input_records": 13125, "num_output_records": 12613, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797290075, "job": 32, "event": "table_file_deletion", "file_number": 56} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840797292423, 
"job": 32, "event": "table_file_deletion", "file_number": 54} Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.186943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292573) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292588) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:57 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-09:59:57.292591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 04:59:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e166 do_prune osdmap full prune enabled Feb 23 04:59:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e167 e167: 6 total, 6 up, 6 in Feb 23 04:59:57 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in Feb 23 04:59:57 localhost neutron_sriov_agent[258207]: 2026-02-23 09:59:57.935 2 INFO neutron.agent.securitygroups_rpc [None req-d119b339-8a48-47e8-bd68-4b4fb753cf46 92730c8dc08c46ec9f30a1ded731d654 b7501fe3a8904b43b875ec99452354a0 - - default default] Security group rule updated ['9373b311-126d-4dcc-ae54-c4b5d87c2dd5']#033[00m Feb 23 04:59:58 localhost neutron_dhcp_agent[265537]: 2026-02-23 09:59:58.331 265541 INFO neutron.agent.linux.ip_lib [None req-f758b58b-9cee-4d2a-8524-163310e3b69a - - - - - -] 
Device tap5dac6bd1-d4 cannot be used as it has no MAC address#033[00m Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.403 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:58 localhost kernel: device tap5dac6bd1-d4 entered promiscuous mode Feb 23 04:59:58 localhost NetworkManager[5974]: [1771840798.4100] manager: (tap5dac6bd1-d4): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.410 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:58 localhost ovn_controller[157695]: 2026-02-23T09:59:58Z|00296|binding|INFO|Claiming lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f for this chassis. Feb 23 04:59:58 localhost ovn_controller[157695]: 2026-02-23T09:59:58Z|00297|binding|INFO|5dac6bd1-d4a3-4582-9c29-07477bd4a21f: Claiming unknown Feb 23 04:59:58 localhost systemd-udevd[318026]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.422 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:58.434 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa5daa8e-ec49-444c-ae76-ef0b7a442dc0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5dac6bd1-d4a3-4582-9c29-07477bd4a21f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:58.436 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5dac6bd1-d4a3-4582-9c29-07477bd4a21f in datapath 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 bound to our chassis#033[00m Feb 23 04:59:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:58.438 163572 DEBUG 
neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:59:58 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:58.439 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[52ac14c7-0f05-45b7-9c72-1d47bd56f871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost ovn_controller[157695]: 2026-02-23T09:59:58Z|00298|binding|INFO|Setting lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f ovn-installed in OVS Feb 23 04:59:58 localhost ovn_controller[157695]: 2026-02-23T09:59:58Z|00299|binding|INFO|Setting lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f up in Southbound Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.444 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.448 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 
localhost journal[231253]: ethtool ioctl error on tap5dac6bd1-d4: No such device Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.488 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:58 localhost nova_compute[282206]: 2026-02-23 09:59:58.514 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:59 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:59.596 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 10a3eba3-46b8-4384-8d8b-3be8944e41b3 with type ""#033[00m Feb 23 04:59:59 localhost ovn_controller[157695]: 2026-02-23T09:59:59Z|00300|binding|INFO|Removing iface tap5dac6bd1-d4 ovn-installed in OVS Feb 23 04:59:59 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:59.598 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=fa5daa8e-ec49-444c-ae76-ef0b7a442dc0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5dac6bd1-d4a3-4582-9c29-07477bd4a21f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 04:59:59 localhost ovn_controller[157695]: 2026-02-23T09:59:59Z|00301|binding|INFO|Removing lport 5dac6bd1-d4a3-4582-9c29-07477bd4a21f ovn-installed in OVS Feb 23 04:59:59 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:59.600 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 5dac6bd1-d4a3-4582-9c29-07477bd4a21f in datapath 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 unbound from our chassis#033[00m Feb 23 04:59:59 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:59.601 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 04:59:59 localhost ovn_controller[157695]: 2026-02-23T09:59:59Z|00302|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 04:59:59 localhost nova_compute[282206]: 2026-02-23 09:59:59.943 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:59 localhost ovn_metadata_agent[163567]: 2026-02-23 09:59:59.945 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b86c58f7-8669-450c-8522-5db750187515]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 04:59:59 localhost nova_compute[282206]: 2026-02-23 09:59:59.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 04:59:59 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 04:59:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:00:00 localhost ceph-mon[294160]: log_channel(cluster) log [INF] : overall HEALTH_OK Feb 23 05:00:00 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:00.235 2 INFO neutron.agent.securitygroups_rpc [None req-d445cf08-2b11-449c-8097-70ad611d5c5d 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 05:00:00 localhost podman[318116]: Feb 23 05:00:00 localhost podman[318090]: 2026-02-23 10:00:00.263057655 +0000 UTC m=+0.296420640 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:00:00 localhost podman[318100]: 2026-02-23 10:00:00.344958527 +0000 UTC m=+0.344493149 container health_status 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=) Feb 23 05:00:00 localhost podman[318116]: 2026-02-23 10:00:00.276644052 +0000 UTC m=+0.250661513 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:00:00 localhost podman[318116]: 2026-02-23 10:00:00.373511915 +0000 UTC m=+0.347529336 container create 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:00:00 localhost podman[318100]: 2026-02-23 10:00:00.390499138 +0000 UTC m=+0.390033760 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1770267347, io.openshift.tags=minimal rhel9, io.openshift.expose-services=) Feb 23 05:00:00 localhost podman[318090]: 2026-02-23 10:00:00.397613121 +0000 UTC m=+0.430976146 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:00:00 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:00:00 localhost systemd[1]: Started libpod-conmon-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913.scope. Feb 23 05:00:00 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:00:00 localhost systemd[1]: Started libcrun container. Feb 23 05:00:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01d3abec598585e6b85eea6b9d1d90a4888675fb80fe787c6c9e936cebc8d754/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:00:00 localhost podman[318116]: 2026-02-23 10:00:00.461553879 +0000 UTC m=+0.435571330 container init 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216) Feb 23 05:00:00 localhost podman[318116]: 2026-02-23 10:00:00.470357276 +0000 UTC m=+0.444374707 container start 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 23 05:00:00 localhost dnsmasq[318156]: started, version 2.85 cachesize 150 Feb 23 05:00:00 localhost dnsmasq[318156]: DNS service limited to local subnets Feb 23 05:00:00 localhost dnsmasq[318156]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:00:00 localhost dnsmasq[318156]: warning: no upstream servers configured Feb 23 05:00:00 localhost dnsmasq-dhcp[318156]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 05:00:00 localhost dnsmasq[318156]: read /var/lib/neutron/dhcp/1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451/addn_hosts - 0 addresses Feb 23 05:00:00 localhost dnsmasq-dhcp[318156]: read /var/lib/neutron/dhcp/1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451/host Feb 23 05:00:00 localhost dnsmasq-dhcp[318156]: read /var/lib/neutron/dhcp/1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451/opts Feb 23 05:00:00 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:00.638 265541 INFO neutron.agent.dhcp.agent [None req-e0e9c5f7-ced6-4771-bf8f-3b9fe0e5d235 - - - - - -] DHCP configuration for ports {'8327b9e9-ba3e-419e-9c3e-f43bbcff0dd0'} is completed#033[00m Feb 23 05:00:00 localhost dnsmasq[318156]: exiting on receipt of SIGTERM Feb 23 05:00:00 localhost podman[318174]: 2026-02-23 10:00:00.708626179 +0000 UTC m=+0.060711448 container kill 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:00:00 localhost systemd[1]: libpod-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913.scope: Deactivated successfully. Feb 23 05:00:00 localhost podman[318187]: 2026-02-23 10:00:00.780681232 +0000 UTC m=+0.054751891 container died 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Feb 23 05:00:00 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:00.807 2 INFO neutron.agent.securitygroups_rpc [None req-f716d86f-e353-494e-a028-e9ccf623762c 982b83c89c37422a910f5359ef7b6ea5 5bed3d4ae9fd4fe7b9440b4587246c14 - - default default] Security group member updated ['3f91b09d-b6ac-403f-adaf-7a684ee36fe5']#033[00m Feb 23 05:00:00 localhost podman[318187]: 2026-02-23 10:00:00.86402295 +0000 UTC m=+0.138093559 container cleanup 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 05:00:00 localhost systemd[1]: libpod-conmon-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913.scope: Deactivated successfully. 
Feb 23 05:00:00 localhost podman[318188]: 2026-02-23 10:00:00.90447541 +0000 UTC m=+0.172859720 container remove 5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1d4e57e1-6bfe-49bd-917a-f0bd4f0ec451, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 05:00:00 localhost kernel: device tap5dac6bd1-d4 left promiscuous mode Feb 23 05:00:00 localhost nova_compute[282206]: 2026-02-23 10:00:00.953 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:00 localhost nova_compute[282206]: 2026-02-23 10:00:00.965 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:01 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:01.000 265541 INFO neutron.agent.dhcp.agent [None req-2051c492-3c01-4c11-90ee-a3bfdd2c9f7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:01 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:01.001 265541 INFO neutron.agent.dhcp.agent [None req-2051c492-3c01-4c11-90ee-a3bfdd2c9f7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:01 localhost systemd[1]: var-lib-containers-storage-overlay-01d3abec598585e6b85eea6b9d1d90a4888675fb80fe787c6c9e936cebc8d754-merged.mount: Deactivated successfully. 
Feb 23 05:00:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d2acad13ee6a591c5fdcbdc19983bbfb59f65ec6dad2fa886ed7b4890b56913-userdata-shm.mount: Deactivated successfully. Feb 23 05:00:01 localhost systemd[1]: run-netns-qdhcp\x2d1d4e57e1\x2d6bfe\x2d49bd\x2d917a\x2df0bd4f0ec451.mount: Deactivated successfully. Feb 23 05:00:01 localhost ceph-mon[294160]: overall HEALTH_OK Feb 23 05:00:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e167 do_prune osdmap full prune enabled Feb 23 05:00:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e168 e168: 6 total, 6 up, 6 in Feb 23 05:00:01 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in Feb 23 05:00:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e168 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 05:00:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e168 do_prune osdmap full prune enabled Feb 23 05:00:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e169 e169: 6 total, 6 up, 6 in Feb 23 05:00:02 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in Feb 23 05:00:03 localhost nova_compute[282206]: 2026-02-23 10:00:03.491 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e169 do_prune osdmap full prune enabled Feb 23 05:00:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e170 e170: 6 total, 6 up, 6 in Feb 23 05:00:04 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e170: 6 total, 6 up, 6 in Feb 23 05:00:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:00:04 localhost ceph-mon[294160]: log_channel(audit) 
log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:00:05 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e46: np0005626465.hlpkwo(active, since 9m), standbys: np0005626463.wtksup, np0005626466.nisqfq Feb 23 05:00:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Feb 23 05:00:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e170 do_prune osdmap full prune enabled Feb 23 05:00:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e171 e171: 6 total, 6 up, 6 in Feb 23 05:00:07 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in Feb 23 05:00:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e171 do_prune osdmap full prune enabled Feb 23 05:00:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e172 e172: 6 total, 6 up, 6 in Feb 23 05:00:08 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in Feb 23 05:00:08 localhost nova_compute[282206]: 2026-02-23 10:00:08.494 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:08 localhost nova_compute[282206]: 2026-02-23 10:00:08.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:08 localhost nova_compute[282206]: 2026-02-23 10:00:08.495 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:00:08 localhost nova_compute[282206]: 2026-02-23 10:00:08.496 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:08 localhost nova_compute[282206]: 2026-02-23 10:00:08.496 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:08 localhost nova_compute[282206]: 2026-02-23 10:00:08.498 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:09 localhost podman[242954]: time="2026-02-23T10:00:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:00:09 localhost podman[242954]: @ - - [23/Feb/2026:10:00:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160733 "" "Go-http-client/1.1" Feb 23 05:00:09 localhost podman[242954]: @ - - [23/Feb/2026:10:00:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19764 "" "Go-http-client/1.1" Feb 23 05:00:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e172 do_prune osdmap full prune enabled Feb 23 05:00:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e173 e173: 6 total, 6 up, 6 in Feb 23 05:00:10 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in Feb 23 05:00:10 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. 
Feb 23 05:00:11 localhost nova_compute[282206]: 2026-02-23 10:00:11.169 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:11 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:11.169 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:00:11 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:11.173 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 05:00:11 localhost podman[318217]: 2026-02-23 10:00:11.504620317 +0000 UTC m=+0.107304441 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Feb 23 05:00:11 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:11.515 265541 INFO neutron.agent.linux.ip_lib [None req-3c25de25-c79a-40f5-a7f7-41e6e4684a57 - - - - - -] Device tapa44c3247-d9 cannot be used as it has no MAC address
Feb 23 05:00:11 localhost podman[318218]: 2026-02-23 10:00:11.555886647 +0000 UTC m=+0.158592902 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 05:00:11 localhost nova_compute[282206]: 2026-02-23 10:00:11.559 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:11 localhost kernel: device tapa44c3247-d9 entered promiscuous mode
Feb 23 05:00:11 localhost podman[318218]: 2026-02-23 10:00:11.568485743 +0000 UTC m=+0.171192068 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 05:00:11 localhost NetworkManager[5974]: [1771840811.5695] manager: (tapa44c3247-d9): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Feb 23 05:00:11 localhost ovn_controller[157695]: 2026-02-23T10:00:11Z|00303|binding|INFO|Claiming lport a44c3247-d9d2-43bf-89f0-20a41979b22d for this chassis.
Feb 23 05:00:11 localhost ovn_controller[157695]: 2026-02-23T10:00:11Z|00304|binding|INFO|a44c3247-d9d2-43bf-89f0-20a41979b22d: Claiming unknown
Feb 23 05:00:11 localhost nova_compute[282206]: 2026-02-23 10:00:11.570 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:11 localhost systemd-udevd[318272]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 05:00:11 localhost podman[318217]: 2026-02-23 10:00:11.582574995 +0000 UTC m=+0.185259129 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 05:00:11 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:11.581 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fc70b01-d165-4d08-954b-a4aac704fc26, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a44c3247-d9d2-43bf-89f0-20a41979b22d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 05:00:11 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:11.586 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a44c3247-d9d2-43bf-89f0-20a41979b22d in datapath 9cf7565c-6c34-4032-874f-5045764c2d40 bound to our chassis
Feb 23 05:00:11 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:11.592 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9cf7565c-6c34-4032-874f-5045764c2d40 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 05:00:11 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:11.594 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[ac4895ad-5821-47a7-9667-4a531ddc0105]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 05:00:11 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 05:00:11 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost ovn_controller[157695]: 2026-02-23T10:00:11Z|00305|binding|INFO|Setting lport a44c3247-d9d2-43bf-89f0-20a41979b22d ovn-installed in OVS
Feb 23 05:00:11 localhost ovn_controller[157695]: 2026-02-23T10:00:11Z|00306|binding|INFO|Setting lport a44c3247-d9d2-43bf-89f0-20a41979b22d up in Southbound
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost nova_compute[282206]: 2026-02-23 10:00:11.621 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost journal[231253]: ethtool ioctl error on tapa44c3247-d9: No such device
Feb 23 05:00:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e173 do_prune osdmap full prune enabled
Feb 23 05:00:11 localhost nova_compute[282206]: 2026-02-23 10:00:11.669 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e174 e174: 6 total, 6 up, 6 in
Feb 23 05:00:11 localhost nova_compute[282206]: 2026-02-23 10:00:11.756 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e47: np0005626465.hlpkwo(active, since 9m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 05:00:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in
Feb 23 05:00:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:00:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e174 do_prune osdmap full prune enabled
Feb 23 05:00:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e175 e175: 6 total, 6 up, 6 in
Feb 23 05:00:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in
Feb 23 05:00:12 localhost ovn_controller[157695]: 2026-02-23T10:00:12Z|00307|binding|INFO|Removing iface tapa44c3247-d9 ovn-installed in OVS
Feb 23 05:00:12 localhost ovn_controller[157695]: 2026-02-23T10:00:12Z|00308|binding|INFO|Removing lport a44c3247-d9d2-43bf-89f0-20a41979b22d ovn-installed in OVS
Feb 23 05:00:12 localhost nova_compute[282206]: 2026-02-23 10:00:12.386 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:12 localhost nova_compute[282206]: 2026-02-23 10:00:12.395 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:12.384 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port ecdcc65f-a986-4b6d-a221-9e63ddf8c10a with type ""
Feb 23 05:00:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:12.396 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9cf7565c-6c34-4032-874f-5045764c2d40', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fc70b01-d165-4d08-954b-a4aac704fc26, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a44c3247-d9d2-43bf-89f0-20a41979b22d) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 05:00:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:12.402 163572 INFO neutron.agent.ovn.metadata.agent [-] Port a44c3247-d9d2-43bf-89f0-20a41979b22d in datapath 9cf7565c-6c34-4032-874f-5045764c2d40 unbound from our chassis
Feb 23 05:00:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:12.406 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9cf7565c-6c34-4032-874f-5045764c2d40 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 05:00:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:12.407 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[d10e106e-d21a-40f8-b8fb-1670a2dac261]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 05:00:12 localhost ovn_controller[157695]: 2026-02-23T10:00:12Z|00309|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 05:00:12 localhost nova_compute[282206]: 2026-02-23 10:00:12.609 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:12 localhost podman[318343]:
Feb 23 05:00:12 localhost podman[318343]: 2026-02-23 10:00:12.708439274 +0000 UTC m=+0.089403299 container create 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 23 05:00:12 localhost podman[318343]: 2026-02-23 10:00:12.661946813 +0000 UTC m=+0.042910878 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 05:00:12 localhost systemd[1]: Started libpod-conmon-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2.scope.
Feb 23 05:00:12 localhost systemd[1]: Started libcrun container.
Feb 23 05:00:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a31478153c0cdc9bae681bf35ede0b8adfe37bddc02d2a887055da94332473a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 05:00:12 localhost podman[318343]: 2026-02-23 10:00:12.799378339 +0000 UTC m=+0.180342364 container init 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 23 05:00:12 localhost podman[318343]: 2026-02-23 10:00:12.808661031 +0000 UTC m=+0.189625046 container start 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 05:00:12 localhost dnsmasq[318360]: started, version 2.85 cachesize 150
Feb 23 05:00:12 localhost dnsmasq[318360]: DNS service limited to local subnets
Feb 23 05:00:12 localhost dnsmasq[318360]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 05:00:12 localhost dnsmasq[318360]: warning: no upstream servers configured
Feb 23 05:00:12 localhost dnsmasq-dhcp[318360]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 23 05:00:12 localhost dnsmasq[318360]: read /var/lib/neutron/dhcp/9cf7565c-6c34-4032-874f-5045764c2d40/addn_hosts - 0 addresses
Feb 23 05:00:12 localhost dnsmasq-dhcp[318360]: read /var/lib/neutron/dhcp/9cf7565c-6c34-4032-874f-5045764c2d40/host
Feb 23 05:00:12 localhost dnsmasq-dhcp[318360]: read /var/lib/neutron/dhcp/9cf7565c-6c34-4032-874f-5045764c2d40/opts
Feb 23 05:00:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:12.900 265541 INFO neutron.agent.dhcp.agent [None req-bb1cc315-296f-4d7a-b16b-9b377c3be573 - - - - - -] DHCP configuration for ports {'4f81ba9a-180b-410b-a42c-37e692c5652b'} is completed
Feb 23 05:00:13 localhost dnsmasq[318360]: exiting on receipt of SIGTERM
Feb 23 05:00:13 localhost podman[318377]: 2026-02-23 10:00:13.026535293 +0000 UTC m=+0.058701314 container kill 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 05:00:13 localhost systemd[1]: libpod-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2.scope: Deactivated successfully.
Feb 23 05:00:13 localhost podman[318390]: 2026-02-23 10:00:13.085050691 +0000 UTC m=+0.046842892 container died 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 23 05:00:13 localhost podman[318390]: 2026-02-23 10:00:13.16522897 +0000 UTC m=+0.127021141 container cleanup 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216)
Feb 23 05:00:13 localhost systemd[1]: libpod-conmon-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2.scope: Deactivated successfully.
Feb 23 05:00:13 localhost podman[318397]: 2026-02-23 10:00:13.223571572 +0000 UTC m=+0.166346476 container remove 1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9cf7565c-6c34-4032-874f-5045764c2d40, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 05:00:13 localhost nova_compute[282206]: 2026-02-23 10:00:13.238 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:13 localhost kernel: device tapa44c3247-d9 left promiscuous mode
Feb 23 05:00:13 localhost nova_compute[282206]: 2026-02-23 10:00:13.257 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:13 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:13.282 265541 INFO neutron.agent.dhcp.agent [None req-63f71cf9-e17c-4ac8-8877-bfe3b1ae197b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 05:00:13 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:13.283 265541 INFO neutron.agent.dhcp.agent [None req-63f71cf9-e17c-4ac8-8877-bfe3b1ae197b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 23 05:00:13 localhost openstack_network_exporter[245358]: ERROR 10:00:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 05:00:13 localhost openstack_network_exporter[245358]:
Feb 23 05:00:13 localhost openstack_network_exporter[245358]: ERROR 10:00:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 05:00:13 localhost openstack_network_exporter[245358]:
Feb 23 05:00:13 localhost nova_compute[282206]: 2026-02-23 10:00:13.496 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:13 localhost systemd[1]: var-lib-containers-storage-overlay-3a31478153c0cdc9bae681bf35ede0b8adfe37bddc02d2a887055da94332473a-merged.mount: Deactivated successfully.
Feb 23 05:00:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1af71bd71e33e7dc3032bc4dd17ddb84dae489562cbc7bff71efd680a1110ab2-userdata-shm.mount: Deactivated successfully.
Feb 23 05:00:13 localhost systemd[1]: run-netns-qdhcp\x2d9cf7565c\x2d6c34\x2d4032\x2d874f\x2d5045764c2d40.mount: Deactivated successfully.
Feb 23 05:00:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:00:13 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:00:14 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:14.176 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 23 05:00:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e175 do_prune osdmap full prune enabled
Feb 23 05:00:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e176 e176: 6 total, 6 up, 6 in
Feb 23 05:00:14 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in
Feb 23 05:00:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e176 do_prune osdmap full prune enabled
Feb 23 05:00:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e177 e177: 6 total, 6 up, 6 in
Feb 23 05:00:15 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in
Feb 23 05:00:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 05:00:16 localhost systemd[1]: tmp-crun.C2FKdt.mount: Deactivated successfully.
Feb 23 05:00:16 localhost podman[318420]: 2026-02-23 10:00:16.922395288 +0000 UTC m=+0.095747418 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, org.label-schema.build-date=20260216, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 23 05:00:16 localhost podman[318420]: 2026-02-23 10:00:16.937358818 +0000 UTC m=+0.110710998 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 05:00:16 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 05:00:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:00:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e177 do_prune osdmap full prune enabled
Feb 23 05:00:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e178 e178: 6 total, 6 up, 6 in
Feb 23 05:00:17 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in
Feb 23 05:00:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:00:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:00:18 localhost nova_compute[282206]: 2026-02-23 10:00:18.500 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 05:00:18 localhost podman[318441]: 2026-02-23 10:00:18.914793021 +0000 UTC m=+0.086708934 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 23 05:00:18 localhost podman[318441]: 2026-02-23 10:00:18.920997396 +0000 UTC m=+0.092913319 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2)
Feb 23 05:00:18 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 05:00:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e178 do_prune osdmap full prune enabled
Feb 23 05:00:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e179 e179: 6 total, 6 up, 6 in
Feb 23 05:00:19 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in
Feb 23 05:00:19 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:19.947 2 INFO neutron.agent.securitygroups_rpc [req-eeab04f9-52db-422e-9728-ee390003483c req-72446ef4-1fda-4ad3-9026-f16aa7abb40d f49fd8b6937445efab40892d03b375d7 0421515e6bb54dea8db3ed218999e195 - - default default] Security group member updated ['c46df023-9a3e-4c54-a0bb-44b675220af4']
Feb 23 05:00:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:00:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:00:20 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 1 addresses
Feb 23 05:00:20 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 05:00:20 localhost systemd[1]: tmp-crun.KKj4Ya.mount: Deactivated successfully.
Feb 23 05:00:20 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 05:00:20 localhost podman[318478]: 2026-02-23 10:00:20.253743592 +0000 UTC m=+0.076968428 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 05:00:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e179 do_prune osdmap full prune enabled
Feb 23 05:00:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e180 e180: 6 total, 6 up, 6 in
Feb 23 05:00:21 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in
Feb 23 05:00:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:00:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e180 do_prune osdmap full prune enabled
Feb 23 05:00:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e181 e181: 6 total, 6 up, 6 in
Feb 23 05:00:22 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.084 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.085 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.085 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 05:00:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e181 do_prune osdmap full prune enabled
Feb 23 05:00:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e182 e182: 6 total, 6 up, 6 in
Feb 23 05:00:23 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.481 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.483 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.483 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.484 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.503 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.504 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.504 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.505 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 23 05:00:23 localhost nova_compute[282206]: 2026-02-23 10:00:23.509 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:00:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:00:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 05:00:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:00:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 05:00:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1428402500' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 05:00:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 05:00:23 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1428402500' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 05:00:24 localhost nova_compute[282206]: 2026-02-23 10:00:24.039 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 05:00:24 localhost nova_compute[282206]: 2026-02-23 10:00:24.066 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 05:00:24 localhost nova_compute[282206]: 2026-02-23 10:00:24.067 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 05:00:24 localhost nova_compute[282206]: 2026-02-23 10:00:24.068 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:00:24 localhost nova_compute[282206]: 2026-02-23 10:00:24.068 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 05:00:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e182 do_prune osdmap full prune enabled
Feb 23 05:00:24 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 05:00:24 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:00:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e183 e183: 6 total, 6 up, 6 in
Feb 23 05:00:24 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in
Feb 23 05:00:24 localhost sshd[318585]: main: sshd: ssh-rsa algorithm is disabled
Feb 23 05:00:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e183 do_prune osdmap full prune enabled
Feb 23 05:00:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e184 e184: 6 total, 6 up, 6 in
Feb 23 05:00:25 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in
Feb 23 05:00:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 05:00:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:00:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 05:00:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4221272771' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 05:00:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 05:00:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4221272771' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 05:00:26 localhost ovn_controller[157695]: 2026-02-23T10:00:26Z|00310|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 05:00:26 localhost nova_compute[282206]: 2026-02-23 10:00:26.207 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:00:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e184 do_prune osdmap full prune enabled
Feb 23 05:00:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e185 e185: 6 total, 6 up, 6 in
Feb 23 05:00:26 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in
Feb 23 05:00:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:00:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:00:27 localhost dnsmasq[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/addn_hosts - 0 addresses
Feb 23 05:00:27 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/host
Feb 23 05:00:27 localhost dnsmasq-dhcp[316315]: read /var/lib/neutron/dhcp/ff7aa220-5765-44c6-9121-cfbd718241c5/opts
Feb 23 05:00:27 localhost podman[318604]: 2026-02-23 10:00:27.146414731 +0000 UTC m=+0.065735325 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 05:00:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:00:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:00:27 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:00:27 localhost ovn_controller[157695]: 2026-02-23T10:00:27Z|00311|binding|INFO|Releasing lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 from this chassis (sb_readonly=0)
Feb 23 05:00:27 localhost ovn_controller[157695]: 2026-02-23T10:00:27Z|00312|binding|INFO|Setting lport d5a42e1b-5089-41c4-9d02-d28b44b515d2 down in Southbound
Feb 23 05:00:27 localhost kernel: device tapd5a42e1b-50 left promiscuous mode
Feb 23 05:00:27 localhost nova_compute[282206]: 2026-02-23 10:00:27.427 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:27 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:27.433 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff7aa220-5765-44c6-9121-cfbd718241c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0421515e6bb54dea8db3ed218999e195', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6cbbaed5-c16c-4b6f-96d8-1ef1b1b430f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d5a42e1b-5089-41c4-9d02-d28b44b515d2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 05:00:27 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:27.435 163572 INFO neutron.agent.ovn.metadata.agent [-] Port d5a42e1b-5089-41c4-9d02-d28b44b515d2 in datapath ff7aa220-5765-44c6-9121-cfbd718241c5 unbound from our chassis
Feb 23 05:00:27 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:27.438 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff7aa220-5765-44c6-9121-cfbd718241c5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 05:00:27 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:27.439 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[e96f8653-33e4-4200-a548-a6093f712819]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 05:00:27 localhost nova_compute[282206]: 2026-02-23 10:00:27.457 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.077 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.077 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.548 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 05:00:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1372422225' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.617 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.697 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.698 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.909 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.911 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11285MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.912 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 05:00:28 localhost nova_compute[282206]: 2026-02-23 10:00:28.912 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 05:00:28 localhost ovn_controller[157695]: 2026-02-23T10:00:28Z|00313|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:28.999 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.000 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.000 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.034 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.054 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:00:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 05:00:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3101926543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.544 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.551 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.607 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.610 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 23 05:00:29 localhost nova_compute[282206]: 2026-02-23 10:00:29.611 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.699s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 05:00:29 localhost dnsmasq[316315]: exiting on receipt of SIGTERM
Feb 23 05:00:29 localhost podman[318687]: 2026-02-23 10:00:29.822715263 +0000 UTC m=+0.063255608 container kill e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 23 05:00:29 localhost systemd[1]: libpod-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf.scope: Deactivated successfully.
Feb 23 05:00:29 localhost podman[318700]: 2026-02-23 10:00:29.892601768 +0000 UTC m=+0.053429189 container died e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 23 05:00:29 localhost systemd[1]: tmp-crun.y0e47k.mount: Deactivated successfully.
Feb 23 05:00:29 localhost podman[318700]: 2026-02-23 10:00:29.949089332 +0000 UTC m=+0.109916723 container cleanup e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:00:29 localhost systemd[1]: libpod-conmon-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf.scope: Deactivated successfully. Feb 23 05:00:29 localhost podman[318702]: 2026-02-23 10:00:29.968862313 +0000 UTC m=+0.117901644 container remove e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ff7aa220-5765-44c6-9121-cfbd718241c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0) Feb 23 05:00:30 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:30.131 265541 INFO neutron.agent.dhcp.agent [None req-eee5d287-8dd8-4baa-a93d-7c177f39d304 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:30 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:30.512 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:30 localhost nova_compute[282206]: 2026-02-23 10:00:30.617 282211 DEBUG 
oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:00:30 localhost nova_compute[282206]: 2026-02-23 10:00:30.618 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:00:30 localhost nova_compute[282206]: 2026-02-23 10:00:30.618 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:00:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:00:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:00:30 localhost systemd[1]: var-lib-containers-storage-overlay-0a4908f0bc9328306b516915e2de85425e40798a526663541b1416ff04dc528a-merged.mount: Deactivated successfully. Feb 23 05:00:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e66f962a68cc6113335b6f9dd3920b4164c15c9c93c83a854a3d14ce3266d7bf-userdata-shm.mount: Deactivated successfully. Feb 23 05:00:30 localhost systemd[1]: run-netns-qdhcp\x2dff7aa220\x2d5765\x2d44c6\x2d9121\x2dcfbd718241c5.mount: Deactivated successfully. 
Feb 23 05:00:30 localhost podman[318730]: 2026-02-23 10:00:30.933818198 +0000 UTC m=+0.093856168 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:00:30 localhost podman[318730]: 2026-02-23 10:00:30.969243361 +0000 UTC m=+0.129281301 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:00:30 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:00:30 localhost podman[318729]: 2026-02-23 10:00:30.986314677 +0000 UTC m=+0.148815544 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1770267347, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9) Feb 23 05:00:31 localhost podman[318729]: 2026-02-23 10:00:31.00231509 +0000 UTC m=+0.164816027 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=) Feb 23 05:00:31 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:31.010 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:31 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:00:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e185 do_prune osdmap full prune enabled Feb 23 05:00:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e186 e186: 6 total, 6 up, 6 in Feb 23 05:00:31 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in Feb 23 05:00:32 localhost nova_compute[282206]: 2026-02-23 10:00:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:00:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:00:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e186 do_prune osdmap full prune enabled Feb 23 05:00:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e187 e187: 6 total, 6 up, 6 in Feb 23 05:00:32 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in Feb 23 05:00:32 localhost dnsmasq[317885]: exiting on receipt of SIGTERM Feb 23 
05:00:32 localhost podman[318787]: 2026-02-23 10:00:32.275615939 +0000 UTC m=+0.069966628 container kill 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216) Feb 23 05:00:32 localhost systemd[1]: libpod-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a.scope: Deactivated successfully. Feb 23 05:00:32 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:32.289 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 034ef5b0-24bc-4eeb-b7eb-fa73747ebcf1 with type ""#033[00m Feb 23 05:00:32 localhost ovn_controller[157695]: 2026-02-23T10:00:32Z|00314|binding|INFO|Removing iface tapcf62e779-21 ovn-installed in OVS Feb 23 05:00:32 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:32.291 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.243/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-5e40f037-bddd-4e41-9358-072288273862', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e40f037-bddd-4e41-9358-072288273862', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': '51dc993993124a3f926af711e8b0f088', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fd636fd1-c9ff-44d0-b6d5-3a4c5f8e69de, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cf62e779-21c9-44d1-992a-8d67e75ee9a4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:00:32 localhost ovn_controller[157695]: 2026-02-23T10:00:32Z|00315|binding|INFO|Removing lport cf62e779-21c9-44d1-992a-8d67e75ee9a4 ovn-installed in OVS Feb 23 05:00:32 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:32.301 163572 INFO neutron.agent.ovn.metadata.agent [-] Port cf62e779-21c9-44d1-992a-8d67e75ee9a4 in datapath 5e40f037-bddd-4e41-9358-072288273862 unbound from our chassis#033[00m Feb 23 05:00:32 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:32.303 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5e40f037-bddd-4e41-9358-072288273862, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 05:00:32 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:32.306 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[057df0c0-dd81-4f4e-8260-0ad6c8469059]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:00:32 localhost nova_compute[282206]: 2026-02-23 10:00:32.333 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:32 localhost podman[318802]: 2026-02-23 10:00:32.365514644 +0000 UTC m=+0.064824068 container died 
993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 23 05:00:32 localhost systemd[1]: tmp-crun.1lVIK3.mount: Deactivated successfully. Feb 23 05:00:32 localhost podman[318802]: 2026-02-23 10:00:32.41060603 +0000 UTC m=+0.109915414 container cleanup 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:00:32 localhost systemd[1]: libpod-conmon-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a.scope: Deactivated successfully. 
Feb 23 05:00:32 localhost podman[318803]: 2026-02-23 10:00:32.454100045 +0000 UTC m=+0.144869041 container remove 993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5e40f037-bddd-4e41-9358-072288273862, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 05:00:32 localhost kernel: device tapcf62e779-21 left promiscuous mode Feb 23 05:00:32 localhost nova_compute[282206]: 2026-02-23 10:00:32.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:32 localhost nova_compute[282206]: 2026-02-23 10:00:32.483 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:32.504 265541 INFO neutron.agent.dhcp.agent [None req-81cab5d0-47b8-4830-8d57-c82a39e48f71 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:32.505 265541 INFO neutron.agent.dhcp.agent [None req-81cab5d0-47b8-4830-8d57-c82a39e48f71 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:00:32 localhost ovn_controller[157695]: 2026-02-23T10:00:32Z|00316|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:00:32 localhost nova_compute[282206]: 2026-02-23 10:00:32.687 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:33 localhost nova_compute[282206]: 2026-02-23 10:00:33.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:00:33 localhost systemd[1]: var-lib-containers-storage-overlay-db56fcd2726acc1983ef9844d0380fb1390c65eda950b0c88ea7f886d5e3e2ce-merged.mount: Deactivated successfully. Feb 23 05:00:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-993d8a17873435aed651c58621e43c8085b43d7880121cc4af8dde85919b951a-userdata-shm.mount: Deactivated successfully. Feb 23 05:00:33 localhost systemd[1]: run-netns-qdhcp\x2d5e40f037\x2dbddd\x2d4e41\x2d9358\x2d072288273862.mount: Deactivated successfully. Feb 23 05:00:33 localhost nova_compute[282206]: 2026-02-23 10:00:33.592 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e187 do_prune osdmap full prune enabled Feb 23 05:00:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e188 e188: 6 total, 6 up, 6 in Feb 23 05:00:33 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in Feb 23 05:00:35 localhost nova_compute[282206]: 2026-02-23 10:00:35.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:00:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:00:38 localhost 
nova_compute[282206]: 2026-02-23 10:00:38.595 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:38 localhost nova_compute[282206]: 2026-02-23 10:00:38.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:38 localhost nova_compute[282206]: 2026-02-23 10:00:38.597 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:00:38 localhost nova_compute[282206]: 2026-02-23 10:00:38.598 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:38 localhost nova_compute[282206]: 2026-02-23 10:00:38.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:38 localhost nova_compute[282206]: 2026-02-23 10:00:38.645 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:39 localhost podman[242954]: time="2026-02-23T10:00:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:00:39 localhost podman[242954]: @ - - [23/Feb/2026:10:00:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:00:39 localhost podman[242954]: @ - - [23/Feb/2026:10:00:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18817 "" "Go-http-client/1.1" Feb 23 05:00:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon 
dump", "format": "json"} v 0) Feb 23 05:00:40 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:00:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e188 do_prune osdmap full prune enabled Feb 23 05:00:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e189 e189: 6 total, 6 up, 6 in Feb 23 05:00:41 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in Feb 23 05:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:00:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:00:41 localhost podman[318835]: 2026-02-23 10:00:41.923893621 +0000 UTC m=+0.094441267 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0) Feb 23 05:00:41 localhost podman[318836]: 2026-02-23 10:00:41.891945157 +0000 UTC m=+0.064206128 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 05:00:41 localhost podman[318836]: 2026-02-23 10:00:41.974380156 +0000 UTC m=+0.146641107 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 05:00:41 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:00:42 localhost podman[318835]: 2026-02-23 10:00:42.026391819 +0000 UTC m=+0.196939505 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Feb 23 05:00:42 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e189 do_prune osdmap full prune enabled Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e190 e190: 6 total, 6 up, 6 in Feb 23 05:00:42 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:00:42 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1664302141' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:00:42 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1664302141' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:00:42 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1142525926' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:00:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:00:42 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1142525926' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:00:43 localhost openstack_network_exporter[245358]: ERROR 10:00:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:00:43 localhost openstack_network_exporter[245358]: Feb 23 05:00:43 localhost openstack_network_exporter[245358]: ERROR 10:00:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:00:43 localhost openstack_network_exporter[245358]: Feb 23 05:00:43 localhost nova_compute[282206]: 2026-02-23 10:00:43.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:43 localhost nova_compute[282206]: 2026-02-23 10:00:43.647 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e190 do_prune osdmap full prune enabled Feb 23 05:00:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e191 e191: 6 total, 6 up, 6 in Feb 23 05:00:44 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in Feb 23 05:00:45 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:45.224 2 INFO neutron.agent.securitygroups_rpc [None req-07956afb-903c-4340-9a87-60ef781dc496 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e191 do_prune osdmap full prune enabled Feb 23 05:00:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e192 e192: 6 total, 6 up, 6 in Feb 23 05:00:46 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in Feb 
23 05:00:46 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:46.612 2 INFO neutron.agent.securitygroups_rpc [None req-8a14fab2-7a8a-4c43-8296-30b056f5b06d 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:46 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:46.805 2 INFO neutron.agent.securitygroups_rpc [None req-8a14fab2-7a8a-4c43-8296-30b056f5b06d 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:47 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:47.061 2 INFO neutron.agent.securitygroups_rpc [None req-9c6cef5b-9665-4a72-8732-114020aa83fb 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e192 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:00:47 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:47.377 2 INFO neutron.agent.securitygroups_rpc [None req-0ce85d6b-73e9-4380-afd1-5f90a1fc10e8 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:00:47 localhost systemd[1]: tmp-crun.LDI31E.mount: Deactivated successfully. 
Feb 23 05:00:47 localhost podman[318883]: 2026-02-23 10:00:47.990808116 +0000 UTC m=+0.054979247 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 05:00:48 localhost podman[318883]: 2026-02-23 10:00:48.028069647 +0000 UTC m=+0.092240778 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 05:00:48 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:00:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e192 do_prune osdmap full prune enabled Feb 23 05:00:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e193 e193: 6 total, 6 up, 6 in Feb 23 05:00:48 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e193: 6 total, 6 up, 6 in Feb 23 05:00:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:48.560 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:00:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:00:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:00:48 localhost nova_compute[282206]: 2026-02-23 10:00:48.648 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:48 localhost nova_compute[282206]: 2026-02-23 10:00:48.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:48 localhost 
nova_compute[282206]: 2026-02-23 10:00:48.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:00:48 localhost nova_compute[282206]: 2026-02-23 10:00:48.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:48 localhost nova_compute[282206]: 2026-02-23 10:00:48.698 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:48 localhost nova_compute[282206]: 2026-02-23 10:00:48.699 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:00:48 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:00:49 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:49.480 2 INFO neutron.agent.securitygroups_rpc [None req-81c2d0d8-1692-4b29-b62c-eb471980e5c6 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e193 do_prune osdmap full prune enabled Feb 23 05:00:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e194 e194: 6 total, 6 up, 6 in Feb 23 05:00:49 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in Feb 23 05:00:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:00:49 localhost podman[318902]: 2026-02-23 10:00:49.889233778 +0000 UTC m=+0.071288060 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 05:00:49 localhost podman[318902]: 2026-02-23 10:00:49.923333009 +0000 UTC m=+0.105387221 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.43.0) Feb 23 05:00:49 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:00:49 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:49.968 2 INFO neutron.agent.securitygroups_rpc [None req-502b98ac-7cd1-4314-8dc8-56f7a95bf1c0 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e194 do_prune osdmap full prune enabled Feb 23 05:00:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e195 e195: 6 total, 6 up, 6 in Feb 23 05:00:50 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in Feb 23 05:00:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e195 do_prune osdmap full prune enabled Feb 23 05:00:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e196 e196: 6 total, 6 up, 6 in Feb 23 05:00:51 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in Feb 23 05:00:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:00:52 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2581763312' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:00:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:00:52 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2581763312' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:00:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:00:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e196 do_prune osdmap full prune enabled Feb 23 05:00:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e197 e197: 6 total, 6 up, 6 in Feb 23 05:00:52 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in Feb 23 05:00:53 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:53.060 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:00:53 localhost nova_compute[282206]: 2026-02-23 10:00:53.061 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:53 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:53.062 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:00:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:00:53 localhost 
ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4075192615' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:00:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:00:53 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4075192615' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:00:53 localhost nova_compute[282206]: 2026-02-23 10:00:53.727 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e197 do_prune osdmap full prune enabled Feb 23 05:00:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e198 e198: 6 total, 6 up, 6 in Feb 23 05:00:53 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in Feb 23 05:00:55 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:55.203 2 INFO neutron.agent.securitygroups_rpc [None req-572fb810-ab54-44db-ac0d-47376cf59f66 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:55 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:55.534 2 INFO neutron.agent.securitygroups_rpc [None req-3cc00f74-b029-4176-993c-a46e50d01263 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:55 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:55.806 2 INFO neutron.agent.securitygroups_rpc [None req-252a7af6-1118-4c58-9e08-ff23cd8dfa0e 4ce76e2a79a849e8a6b3b31c05f9bc96 
90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e198 do_prune osdmap full prune enabled Feb 23 05:00:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e199 e199: 6 total, 6 up, 6 in Feb 23 05:00:55 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.144 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.144 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.145 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.145 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.150 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6c2bf04-0278-461c-bb91-654fe331cfd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.145685', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tapa27e5011-20'}, 'message_id': '8bcc7ad6-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'b9d1c0e4136abdc77622d645b539f0f36fa30239089fab5d7cd7bf0ff34f84ff'}]}, 'timestamp': '2026-02-23 10:00:56.151037', '_unique_id': '890494c2c06e4b32a566b7bf0d8391d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.152 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.153 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.184 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e2919e70-bc51-47e6-abb8-e9a2c13a2df1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.154106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bd19b42-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '64e32666570d1a70b77e755d421f11f235990abc064b62a393f5b33b6dd1a20a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.154106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bd1aeca-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'e6f7596c86973b332d4bcf9fe678043ec7dd4a6120e4a6f838154d65a7fcd1ac'}]}, 'timestamp': '2026-02-23 10:00:56.185045', '_unique_id': 'ba5a1f0d936d4f0e8d1bc870b7a055c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.186 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.198 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.199 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '574409cf-3756-48f4-8f89-37fb5ffcd899', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.187517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bd3cf5c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': 'ff81cdebb056c454e1d81b627dbf357ca17a6933880a180167f02c48ac2f0269'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T10:00:56.187517', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bd3e21c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': '660aab5f87c6981cb852b808a5588ca349c074894f74d2cb3a0d0c6ccc88654e'}]}, 'timestamp': '2026-02-23 10:00:56.199429', '_unique_id': '5cc9d0188a52455082c2443381e8db29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.200 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.201 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4b25202b-02fb-466b-8a7a-31eaae7ae990', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.201820', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd4522e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '7b951179ef1c2c56b59893a924acc75947cd295a8815aff05dd5660b151a61d2'}]}, 'timestamp': '2026-02-23 10:00:56.202325', '_unique_id': '5cf1d60bf175421691decc00b2cfa6d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.203 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.204 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca9164bb-9f13-4760-a464-96e566605d3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.204461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bd4b890-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'c27b44e6edd01925495145055a8961ea348f9ebe51f6b98d8888edb6adc28b0b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.204461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bd4ca38-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '240cbc0a4b5f4ddd569d407e5f0d2b27d42917e1e80b36cfc033804e2e4b6f5c'}]}, 'timestamp': '2026-02-23 10:00:56.205370', '_unique_id': '07f1b551489b41e59060a2621f944d4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.206 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.207 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '510ebb4d-1488-4d16-bef4-91cd61d30e25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.207528', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd52f78-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '98832c87f1114055f456e289ec98e0dcc5ae1e152f81b61186c26f53da5b95f6'}]}, 'timestamp': '2026-02-23 10:00:56.208022', '_unique_id': '94da94fb8ad74c64abbccda810ed1258'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.208 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.210 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.225 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:56.227 2 INFO neutron.agent.securitygroups_rpc [None req-a3f5d0f4-f2a7-4483-8e6e-6de10e55779f 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7d5bb7b6-f39f-42e2-b0ca-9a5406b9a8f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:00:56.210423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8bd7fece-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.415218954, 'message_signature': '2fc18f39bd6f4069fbbf2fda1d81c2e3d770d9d4b2649b1f47980e1eac884863'}]}, 'timestamp': '2026-02-23 10:00:56.226456', '_unique_id': 'd8922b6d7c2d453384ec095fe125e04d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.227 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 05:00:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81976945-ac22-4212-b322-aa792ab0eb15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.228944', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd87674-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 
'message_signature': '252f91ee4f0a0def57a500ef89082db05f4069ac8bd0aa789e41f374b6a5b4a5'}]}, 'timestamp': '2026-02-23 10:00:56.229479', '_unique_id': 'e4aaf7c904db44649624c575bcc4e2a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.230 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '19ecc10f-a046-44b8-84b5-be0a2f62189a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.231766', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd8e41a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '2c48bd37bcef21e03711c8f528ed299645999707f331ae822649f5d0f41d61f0'}]}, 'timestamp': '2026-02-23 10:00:56.232320', '_unique_id': 'bf29c8ca251b42c8a631de29f98ad0f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.233 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebce52c2-e71d-4f9a-9f69-a4bb4e75c42f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.234608', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd9517a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'df229a6d991370a63362e21cebf9405fcccd314cc4ef71ce61ea41c3984c67e7'}]}, 'timestamp': '2026-02-23 10:00:56.235111', '_unique_id': '6c6ea3ee5add44dfbd1501632daf7fae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.236 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.237 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3b85a616-f2f2-4569-88f0-2e9d04459e10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.237642', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bd9c9c0-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'd23444cd6267e5c01945efea793699e73348d25750e5360045c186ce091cf162'}]}, 'timestamp': '2026-02-23 10:00:56.238270', '_unique_id': '99d1c1e3a641443da5a805dcc10fd83c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.239 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9925f058-5d7d-4d2b-a038-a9e5aefa5e38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.240541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bda3914-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'e363dca22e4214ef188a865592242a6d62c1e956a9421ad99e3f617c7ed7964a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.240541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bda4c38-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'f32bb874f557b20fce9df58d14430c10d3f4d656db2bf6d73a93823344119ef4'}]}, 'timestamp': '2026-02-23 10:00:56.241466', '_unique_id': '1356d5e737e544f68a07527cfffef623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12
ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]:
2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cfcbf7d-0458-4a27-969a-51c3ba158713', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.243677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdab4a2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': 'fb4666d732d403a4e93e6b2ac0c906494e40042db1ffd38e8c376d9e1b32d060'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.243677', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdac564-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': '0e4392b35193a69e77fe57787014433f6169c12a6739f880b545e5f74387e9bc'}]}, 'timestamp': '2026-02-23 10:00:56.244603', '_unique_id': '4e7bd622b2704465bec62e0ad46c69a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23
10:00:56.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.245 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '234e5f17-4e0b-4cd2-9857-12b883fad38e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.246820', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bdb304e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '11e9acbbf9d78df5f34512e50cdfc64c88b40265f939c1c0949935201d4bc8ec'}]}, 'timestamp': '2026-02-23 10:00:56.247333', '_unique_id': '5ae68f0bd3ee4319a2821e1fdb57a53b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0268a5b-f6bc-4abb-87be-5efa26b7719d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.249497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdb96ba-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': 'fd7b48ef33cd27821bb1f04c6859d50957645e733c6454646bbbae8140f2ecf2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.249497', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdbb9ec-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.377011584, 'message_signature': '42a44054d1f2f31ca216287ae664c981d68b85a1b2713ca32fe7598a581d88f8'}]}, 'timestamp': '2026-02-23 10:00:56.250987', '_unique_id': '44c832d03cde41519216114953443909'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.255 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a8c915f6-271f-4f62-b25d-fe1878574267', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.254354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdc5cbc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'b58f6c6dad5b5f0b81e96d71abf7c1b26ef39a7a8e5c330f6c64cc0dccb23638'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.254354', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdc7314-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '6af67ddb8142bc9e5d9f6087668dece677d81b662a5279e0cb343a73b5abb717'}]}, 'timestamp': '2026-02-23 10:00:56.255570', '_unique_id': '4dfd55337d7d43d2801a66c1efb9e4e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.256 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '928fcb0f-5d30-4efc-a22c-50532174a63f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.257860', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bdcdfe8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': '5f484b8026f8e365cc860933081fd0f68a4c864d2e0eccea5876cf54d5a063f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.257860', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bdcf05a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'f6c383d15d83eacf8ee1fb1232909ea10ab7ebf6fe121c59038f5cc0288a0af4'}]}, 'timestamp': '2026-02-23 10:00:56.258773', '_unique_id': '773c8bec359746059b506a73e3994d72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.260 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification 
to notifications. Payload={'message_id': '79012cf4-6c87-46ba-abe7-e133783cc539', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.261726', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bdd7282-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': 'ef10eae0eecf80ea3fb5fdf78daee43ccb1da7dddc6f862aae8e5554e4d07ea4'}]}, 'timestamp': '2026-02-23 10:00:56.262056', '_unique_id': 'e7ec4a0b5972457c94069e9dd2dd8f15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b77ba3e-3eaa-4f4d-9767-ae43df8c186b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:00:56.263406', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '8bddb328-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.33519967, 'message_signature': '3bfb41ab7c6f7c250b7a3d127e379bdbc1631b9f3afbbf0045d55b6dd2f28ad6'}]}, 'timestamp': '2026-02-23 10:00:56.263700', '_unique_id': '98ddd7c9fb4c441697ab60042b043545'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 14880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2503d5c-75c0-414b-a7f3-fe6b2f0fdd30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14880000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:00:56.265110', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8bddf5c2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.415218954, 'message_signature': 'eaafd2f6f576736364ec8a7a8fce765efe1f135b96de1e17b837f9809aee04d6'}]}, 'timestamp': '2026-02-23 10:00:56.265398', '_unique_id': 'a3381aa984f94dfdb490bb0e2031f8ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.265 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17c7d105-fbdc-4cd5-b1dd-ad3a446b45bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:00:56.266740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8bde35aa-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'cb732127c5602dfbee01e679ca3b54e1211f3f01a9bec6ab287f532953430c84'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:00:56.266740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8bde3ffa-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12296.343602384, 'message_signature': 'd6f4928ec8eadbb5745bbcbecb368876c928a73e5373eb5496a322f41161e8c2'}]}, 'timestamp': '2026-02-23 10:00:56.267287', '_unique_id': '2d24a51e866a4f10962b05c2023f3bae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging
self.connection.ensure_connection( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:00:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:00:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 05:00:56 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:56.569 2 INFO neutron.agent.securitygroups_rpc [None 
req-84c168ac-c1cc-4ad3-a429-c07c41b0ce10 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:57 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:57.207 2 INFO neutron.agent.securitygroups_rpc [None req-076cf072-6b56-42a9-a049-cb8eb1290757 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:00:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:00:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e199 do_prune osdmap full prune enabled Feb 23 05:00:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e200 e200: 6 total, 6 up, 6 in Feb 23 05:00:57 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in Feb 23 05:00:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:00:58 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:00:58 localhost nova_compute[282206]: 2026-02-23 10:00:58.728 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:58 localhost nova_compute[282206]: 2026-02-23 10:00:58.730 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:00:58 localhost nova_compute[282206]: 2026-02-23 10:00:58.730 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe 
run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:00:58 localhost nova_compute[282206]: 2026-02-23 10:00:58.730 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:58 localhost nova_compute[282206]: 2026-02-23 10:00:58.759 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:58 localhost nova_compute[282206]: 2026-02-23 10:00:58.760 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:00:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e200 do_prune osdmap full prune enabled Feb 23 05:00:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e201 e201: 6 total, 6 up, 6 in Feb 23 05:00:58 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in Feb 23 05:00:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:00:59.127 265541 INFO neutron.agent.linux.ip_lib [None req-87a63f7f-d975-45c2-84d9-883b37d73f3d - - - - - -] Device tapca3573cf-58 cannot be used as it has no MAC address#033[00m Feb 23 05:00:59 localhost nova_compute[282206]: 2026-02-23 10:00:59.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:59 localhost kernel: device tapca3573cf-58 entered promiscuous mode Feb 23 05:00:59 localhost nova_compute[282206]: 2026-02-23 10:00:59.169 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:00:59 localhost ovn_controller[157695]: 2026-02-23T10:00:59Z|00317|binding|INFO|Claiming lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e for this chassis. 
Feb 23 05:00:59 localhost ovn_controller[157695]: 2026-02-23T10:00:59Z|00318|binding|INFO|ca3573cf-58bc-4b24-8e6e-7f86bcaa638e: Claiming unknown
Feb 23 05:00:59 localhost NetworkManager[5974]: [1771840859.1761] manager: (tapca3573cf-58): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Feb 23 05:00:59 localhost systemd-udevd[318932]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 05:00:59 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:59.182 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90343b3c0ce240adab2c21e5c92b6952', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b465d1-55ea-4c9a-b8cb-0871f64f66d5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca3573cf-58bc-4b24-8e6e-7f86bcaa638e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 05:00:59 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:59.184 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ca3573cf-58bc-4b24-8e6e-7f86bcaa638e in datapath 9215e91e-1f4e-4608-9372-53243278a03d bound to our chassis#033[00m
Feb 23 05:00:59 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:59.186 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9215e91e-1f4e-4608-9372-53243278a03d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 23 05:00:59 localhost ovn_metadata_agent[163567]: 2026-02-23 10:00:59.188 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[bf3931a9-9362-47b9-993a-8103855922df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 05:00:59 localhost ovn_controller[157695]: 2026-02-23T10:00:59Z|00319|binding|INFO|Setting lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e ovn-installed in OVS
Feb 23 05:00:59 localhost ovn_controller[157695]: 2026-02-23T10:00:59Z|00320|binding|INFO|Setting lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e up in Southbound
Feb 23 05:00:59 localhost nova_compute[282206]: 2026-02-23 10:00:59.210 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:00:59 localhost nova_compute[282206]: 2026-02-23 10:00:59.259 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:00:59 localhost nova_compute[282206]: 2026-02-23 10:00:59.303 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:00:59 localhost neutron_sriov_agent[258207]: 2026-02-23 10:00:59.724 2 INFO neutron.agent.securitygroups_rpc [None req-ecc0bc70-4d61-4d71-86a4-16b51953c44e 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m
Feb 23 05:00:59 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e201 do_prune osdmap full prune enabled
Feb 23 05:00:59 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e202 e202: 6 total, 6 up, 6 in
Feb 23 05:01:00 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in
Feb 23 05:01:00 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:00.065 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Feb 23 05:01:00 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:00.326 2 INFO neutron.agent.securitygroups_rpc [None req-0f7cdb10-6b15-41bf-967b-b0f718bce643 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m
Feb 23 05:01:00 localhost podman[318985]:
Feb 23 05:01:00 localhost podman[318985]: 2026-02-23 10:01:00.398006194 +0000 UTC m=+0.101822478 container create 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0)
Feb 23 05:01:00 localhost systemd[1]: Started libpod-conmon-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308.scope.
Feb 23 05:01:00 localhost podman[318985]: 2026-02-23 10:01:00.345435673 +0000 UTC m=+0.049251957 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 05:01:00 localhost systemd[1]: Started libcrun container.
Feb 23 05:01:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/716b341786c1c3263e23387b3074c8217c19b5220ae1993515ce916482e724d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 23 05:01:00 localhost podman[318985]: 2026-02-23 10:01:00.492666438 +0000 UTC m=+0.196482692 container init 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 23 05:01:00 localhost podman[318985]: 2026-02-23 10:01:00.506925365 +0000 UTC m=+0.210741649 container start 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 23 05:01:00 localhost dnsmasq[319003]: started, version 2.85 cachesize 150
Feb 23 05:01:00 localhost dnsmasq[319003]: DNS service limited to local subnets
Feb 23 05:01:00 localhost dnsmasq[319003]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 23 05:01:00 localhost dnsmasq[319003]: warning: no upstream servers configured
Feb 23 05:01:00 localhost dnsmasq-dhcp[319003]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 23 05:01:00 localhost dnsmasq[319003]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/addn_hosts - 0 addresses
Feb 23 05:01:00 localhost dnsmasq-dhcp[319003]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/host
Feb 23 05:01:00 localhost dnsmasq-dhcp[319003]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/opts
Feb 23 05:01:00 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:01:00.687 265541 INFO neutron.agent.dhcp.agent [None req-781ebf1e-ee3c-40a5-a3fe-2998c03fa027 - - - - - -] DHCP configuration for ports {'28a87a71-0275-4bed-a2c7-c89f13d53c0a'} is completed#033[00m
Feb 23 05:01:00 localhost dnsmasq[319003]: exiting on receipt of SIGTERM
Feb 23 05:01:00 localhost podman[319021]: 2026-02-23 10:01:00.87356791 +0000 UTC m=+0.062026919 container kill 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 23 05:01:00 localhost systemd[1]: libpod-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308.scope: Deactivated successfully.
Feb 23 05:01:00 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:00.891 2 INFO neutron.agent.securitygroups_rpc [None req-9f100d5f-55ff-4c02-bab1-7dd2c773e5e4 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m
Feb 23 05:01:00 localhost podman[319034]: 2026-02-23 10:01:00.939655015 +0000 UTC m=+0.054961016 container died 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 05:01:01 localhost podman[319034]: 2026-02-23 10:01:01.028572788 +0000 UTC m=+0.143878739 container cleanup 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0)
Feb 23 05:01:01 localhost systemd[1]: libpod-conmon-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308.scope: Deactivated successfully.
Feb 23 05:01:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 05:01:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 05:01:01 localhost podman[319041]: 2026-02-23 10:01:01.053495051 +0000 UTC m=+0.147744532 container remove 8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 05:01:01 localhost podman[319062]: 2026-02-23 10:01:01.152121328 +0000 UTC m=+0.088409767 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1770267347, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 23 05:01:01 localhost podman[319062]: 2026-02-23 10:01:01.170444504 +0000 UTC m=+0.106732993 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container)
Feb 23 05:01:01 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 05:01:01 localhost podman[319064]: 2026-02-23 10:01:01.266030626 +0000 UTC m=+0.201205641 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 05:01:01 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:01.282 2 INFO neutron.agent.securitygroups_rpc [None req-3d8beedf-5b31-48e0-a2b2-815a1e0fad78 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m
Feb 23 05:01:01 localhost podman[319064]: 2026-02-23 10:01:01.302508001 +0000 UTC m=+0.237682996 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Feb 23 05:01:01 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 05:01:01 localhost systemd[1]: tmp-crun.acftY0.mount: Deactivated successfully.
Feb 23 05:01:01 localhost systemd[1]: var-lib-containers-storage-overlay-716b341786c1c3263e23387b3074c8217c19b5220ae1993515ce916482e724d4-merged.mount: Deactivated successfully.
Feb 23 05:01:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e328d5c9583c54d82bc7578a8d4a7e2918247e539b7dcafd39192d465e1b308-userdata-shm.mount: Deactivated successfully.
Feb 23 05:01:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:01:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e202 do_prune osdmap full prune enabled
Feb 23 05:01:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e203 e203: 6 total, 6 up, 6 in
Feb 23 05:01:02 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in
Feb 23 05:01:02 localhost ovn_controller[157695]: 2026-02-23T10:01:02Z|00321|binding|INFO|Removing iface tapca3573cf-58 ovn-installed in OVS
Feb 23 05:01:02 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:02.430 163572 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3f25bc68-8a54-46df-a558-fc9b93a655ff with type ""#033[00m
Feb 23 05:01:02 localhost ovn_controller[157695]: 2026-02-23T10:01:02Z|00322|binding|INFO|Removing lport ca3573cf-58bc-4b24-8e6e-7f86bcaa638e ovn-installed in OVS
Feb 23 05:01:02 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:02.432 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1::2/64 2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9215e91e-1f4e-4608-9372-53243278a03d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90343b3c0ce240adab2c21e5c92b6952', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9b465d1-55ea-4c9a-b8cb-0871f64f66d5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=ca3573cf-58bc-4b24-8e6e-7f86bcaa638e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 05:01:02 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:02.434 163572 INFO neutron.agent.ovn.metadata.agent [-] Port ca3573cf-58bc-4b24-8e6e-7f86bcaa638e in datapath 9215e91e-1f4e-4608-9372-53243278a03d unbound from our chassis#033[00m
Feb 23 05:01:02 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:02.436 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9215e91e-1f4e-4608-9372-53243278a03d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 23 05:01:02 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:02.441 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[598db6b0-ce86-4f12-a0b0-9a6b0ac7efe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 05:01:02 localhost nova_compute[282206]: 2026-02-23 10:01:02.465 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:01:02 localhost nova_compute[282206]: 2026-02-23 10:01:02.467 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:01:02 localhost podman[319169]:
Feb 23 05:01:02 localhost podman[319169]: 2026-02-23 10:01:02.643784345 +0000 UTC m=+0.098969800 container create 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true)
Feb 23 05:01:02 localhost podman[319169]: 2026-02-23 10:01:02.598513744 +0000 UTC m=+0.053699249 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 23 05:01:02 localhost systemd[1]: Started libpod-conmon-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d.scope.
Feb 23 05:01:02 localhost systemd[1]: tmp-crun.HEif4p.mount: Deactivated successfully.
Feb 23 05:01:02 localhost systemd[1]: Started libcrun container.
Feb 23 05:01:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0f08d8ee518831f994af401d4ecaa6867d89605187afe8b2fdadfbff6f089c22/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:01:02 localhost podman[319169]: 2026-02-23 10:01:02.736177527 +0000 UTC m=+0.191362982 container init 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0) Feb 23 05:01:02 localhost podman[319169]: 2026-02-23 10:01:02.746835402 +0000 UTC m=+0.202020857 container start 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216) Feb 23 05:01:02 localhost dnsmasq[319187]: started, version 2.85 cachesize 150 Feb 23 05:01:02 localhost dnsmasq[319187]: DNS service limited to local subnets Feb 23 05:01:02 localhost dnsmasq[319187]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:01:02 localhost dnsmasq[319187]: warning: no upstream servers 
configured Feb 23 05:01:02 localhost dnsmasq-dhcp[319187]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 05:01:02 localhost dnsmasq-dhcp[319187]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Feb 23 05:01:02 localhost dnsmasq[319187]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/addn_hosts - 1 addresses Feb 23 05:01:02 localhost dnsmasq-dhcp[319187]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/host Feb 23 05:01:02 localhost dnsmasq-dhcp[319187]: read /var/lib/neutron/dhcp/9215e91e-1f4e-4608-9372-53243278a03d/opts Feb 23 05:01:02 localhost ovn_controller[157695]: 2026-02-23T10:01:02Z|00323|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:01:02 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:01:02.935 265541 INFO neutron.agent.dhcp.agent [None req-f132ed7e-7992-48a7-a661-9fc0064681a1 - - - - - -] DHCP configuration for ports {'28a87a71-0275-4bed-a2c7-c89f13d53c0a', 'aa36b0a9-2380-4c2e-a8cd-48f5992ac1f5', 'ca3573cf-58bc-4b24-8e6e-7f86bcaa638e'} is completed#033[00m Feb 23 05:01:02 localhost nova_compute[282206]: 2026-02-23 10:01:02.969 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:03 localhost dnsmasq[319187]: exiting on receipt of SIGTERM Feb 23 05:01:03 localhost podman[319205]: 2026-02-23 10:01:03.110783802 +0000 UTC m=+0.067841152 container kill 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS) Feb 23 05:01:03 localhost systemd[1]: libpod-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d.scope: Deactivated successfully. Feb 23 05:01:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:01:03 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2470762040' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:01:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:01:03 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2470762040' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:01:03 localhost podman[319217]: 2026-02-23 10:01:03.183979321 +0000 UTC m=+0.059564693 container died 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 05:01:03 localhost podman[319217]: 2026-02-23 10:01:03.228577751 +0000 UTC m=+0.104163063 container cleanup 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 05:01:03 localhost systemd[1]: libpod-conmon-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d.scope: Deactivated successfully. Feb 23 05:01:03 localhost podman[319219]: 2026-02-23 10:01:03.266417259 +0000 UTC m=+0.133211804 container remove 46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9215e91e-1f4e-4608-9372-53243278a03d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 05:01:03 localhost nova_compute[282206]: 2026-02-23 10:01:03.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:03 localhost kernel: device tapca3573cf-58 left promiscuous mode Feb 23 05:01:03 localhost nova_compute[282206]: 2026-02-23 10:01:03.292 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:01:03.312 265541 INFO neutron.agent.dhcp.agent [None req-632bf260-5ef9-432d-9db3-b514c42098c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:01:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:01:03.313 265541 INFO neutron.agent.dhcp.agent [None 
req-632bf260-5ef9-432d-9db3-b514c42098c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:01:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:01:03.313 265541 INFO neutron.agent.dhcp.agent [None req-632bf260-5ef9-432d-9db3-b514c42098c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:01:03 localhost systemd[1]: var-lib-containers-storage-overlay-0f08d8ee518831f994af401d4ecaa6867d89605187afe8b2fdadfbff6f089c22-merged.mount: Deactivated successfully. Feb 23 05:01:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46433afe735638d07e26140d4d379e9e0550ee11808203548579ae3ef8e4cc0d-userdata-shm.mount: Deactivated successfully. Feb 23 05:01:03 localhost systemd[1]: run-netns-qdhcp\x2d9215e91e\x2d1f4e\x2d4608\x2d9372\x2d53243278a03d.mount: Deactivated successfully. Feb 23 05:01:03 localhost nova_compute[282206]: 2026-02-23 10:01:03.799 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:01:04 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3340721442' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:01:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:01:04 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3340721442' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:01:04 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:04.408 2 INFO neutron.agent.securitygroups_rpc [None req-8dbf60ea-0a4f-4b6f-8f3b-cd15a4e6bbb5 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:05 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:05.237 2 INFO neutron.agent.securitygroups_rpc [None req-03b9fc8c-f7be-48ab-90b7-4895d3ad0871 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:05 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:05.837 2 INFO neutron.agent.securitygroups_rpc [None req-33f0d874-7a76-4789-bd46-4efe27d23136 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:06 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:06.256 2 INFO neutron.agent.securitygroups_rpc [None req-57a0e794-cdb7-43ae-ad96-b51298fae526 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:01:06 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:01:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 
Feb 23 05:01:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e203 do_prune osdmap full prune enabled Feb 23 05:01:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e204 e204: 6 total, 6 up, 6 in Feb 23 05:01:07 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in Feb 23 05:01:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e204 do_prune osdmap full prune enabled Feb 23 05:01:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e205 e205: 6 total, 6 up, 6 in Feb 23 05:01:08 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in Feb 23 05:01:08 localhost nova_compute[282206]: 2026-02-23 10:01:08.802 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:09 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:09.251 2 INFO neutron.agent.securitygroups_rpc [None req-8f58242c-3b5b-4ffa-9190-0f405a707baa 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:09 localhost podman[242954]: time="2026-02-23T10:01:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:01:09 localhost podman[242954]: @ - - [23/Feb/2026:10:01:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:01:09 localhost podman[242954]: @ - - [23/Feb/2026:10:01:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18822 "" "Go-http-client/1.1" Feb 23 05:01:09 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:01:09 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : 
from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:01:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e205 do_prune osdmap full prune enabled Feb 23 05:01:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e206 e206: 6 total, 6 up, 6 in Feb 23 05:01:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 6 up, 6 in Feb 23 05:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:01:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:01:12 localhost podman[319248]: 2026-02-23 10:01:12.911885412 +0000 UTC m=+0.084302319 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Feb 23 05:01:13 localhost systemd[1]: tmp-crun.cMfIaG.mount: Deactivated successfully. Feb 23 05:01:13 localhost podman[319249]: 2026-02-23 10:01:13.008720153 +0000 UTC m=+0.174133979 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 05:01:13 localhost podman[319249]: 2026-02-23 10:01:13.020236465 +0000 UTC m=+0.185650261 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:01:13 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:01:13 localhost podman[319248]: 2026-02-23 10:01:13.033584234 +0000 UTC m=+0.206001161 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ovn_controller) Feb 23 05:01:13 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:01:13 localhost openstack_network_exporter[245358]: ERROR 10:01:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:01:13 localhost openstack_network_exporter[245358]: Feb 23 05:01:13 localhost openstack_network_exporter[245358]: ERROR 10:01:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:01:13 localhost openstack_network_exporter[245358]: Feb 23 05:01:13 localhost nova_compute[282206]: 2026-02-23 10:01:13.805 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e206 do_prune osdmap full prune enabled Feb 23 05:01:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 e207: 6 total, 6 up, 6 in Feb 23 05:01:17 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in Feb 23 05:01:18 localhost nova_compute[282206]: 2026-02-23 10:01:18.809 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:01:18 localhost podman[319296]: 2026-02-23 10:01:18.914711126 +0000 UTC m=+0.087966754 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
io.buildah.version=1.43.0) Feb 23 05:01:18 localhost podman[319296]: 2026-02-23 10:01:18.954386181 +0000 UTC m=+0.127641810 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, 
org.label-schema.schema-version=1.0) Feb 23 05:01:18 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:01:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:01:20 localhost podman[319315]: 2026-02-23 10:01:20.925562619 +0000 UTC m=+0.090059219 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216) Feb 23 05:01:20 localhost podman[319315]: 2026-02-23 10:01:20.934772278 +0000 UTC m=+0.099268878 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0) Feb 23 05:01:20 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:01:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:23 localhost nova_compute[282206]: 2026-02-23 10:01:23.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:23 localhost nova_compute[282206]: 2026-02-23 10:01:23.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:01:23 localhost nova_compute[282206]: 2026-02-23 10:01:23.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:01:23 localhost nova_compute[282206]: 2026-02-23 10:01:23.811 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:25 localhost systemd[1]: tmp-crun.tv0G3v.mount: Deactivated successfully. 
Feb 23 05:01:25 localhost podman[319443]: 2026-02-23 10:01:25.19992067 +0000 UTC m=+0.090853915 container exec fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux , GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_BRANCH=main, release=1770267347, version=7, io.openshift.tags=rhceph ceph) Feb 23 05:01:25 localhost podman[319443]: 2026-02-23 10:01:25.338176522 +0000 UTC m=+0.229109717 container exec_died fdf07215f0388d0ebc44f1f3744080ba594441e647c300d0dade62ff5beba234 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-f1fea371-cb69-578d-a3d0-b5c472a84b46-crash-np0005626463, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Feb 23 05:01:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 05:01:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 05:01:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 05:01:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 
0) Feb 23 05:01:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' 
entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 05:01:26 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:01:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: 
from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:01:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:28 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626466.localdomain to 836.6M Feb 23 05:01:28 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626466.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Feb 23 05:01:28 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626463.localdomain to 836.6M Feb 23 05:01:28 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 05:01:28 localhost ceph-mon[294160]: Adjusting osd_memory_target on np0005626465.localdomain to 836.6M Feb 23 05:01:28 localhost ceph-mon[294160]: Unable to set osd_memory_target on np0005626465.localdomain to 
877246668: error parsing value: Value '877246668' is below minimum 939524096 Feb 23 05:01:28 localhost nova_compute[282206]: 2026-02-23 10:01:28.814 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:01:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:01:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:01:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:01:31 localhost podman[319649]: 2026-02-23 10:01:31.911331475 +0000 UTC m=+0.079990632 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, 
config_id=podman_exporter, container_name=podman_exporter) Feb 23 05:01:31 localhost podman[319649]: 2026-02-23 10:01:31.917650105 +0000 UTC m=+0.086309212 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:01:31 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 05:01:31 localhost podman[319648]: 2026-02-23 10:01:31.971847716 +0000 UTC m=+0.142475135 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 23 05:01:31 localhost podman[319648]: 2026-02-23 10:01:31.984662028 +0000 UTC m=+0.155289457 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 05:01:31 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:01:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:33 localhost nova_compute[282206]: 2026-02-23 10:01:33.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:01:33 localhost nova_compute[282206]: 2026-02-23 10:01:33.818 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:33 localhost nova_compute[282206]: 2026-02-23 10:01:33.818 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:01:33 localhost nova_compute[282206]: 2026-02-23 10:01:33.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:01:33 localhost nova_compute[282206]: 2026-02-23 10:01:33.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:01:35 localhost nova_compute[282206]: 2026-02-23 10:01:35.290 282211 WARNING oslo.service.loopingcall [-] Function 'nova.servicegroup.drivers.db.DbDriver._report_state' run outlasted interval by 7.45 sec#033[00m Feb 23 05:01:36 localhost nova_compute[282206]: 
2026-02-23 10:01:36.384 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:01:36 localhost nova_compute[282206]: 2026-02-23 10:01:36.384 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:01:36 localhost nova_compute[282206]: 2026-02-23 10:01:36.385 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:01:36 localhost nova_compute[282206]: 2026-02-23 10:01:36.385 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:01:37 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:37.391 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 
05:01:37 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:37.392 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:01:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:37 localhost nova_compute[282206]: 2026-02-23 10:01:37.419 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:01:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:01:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:01:38 
localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 23 05:01:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:01:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:01:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7", "osd", "allow rw pool=manila_data namespace=fsvolumens_7a63f65b-263e-4f0a-be43-9aace02f6e45", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.426 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": 
"192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.453 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.453 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.454 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 
2026-02-23 10:01:38.454 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.455 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.455 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.455 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.456 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.456 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.457 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.483 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.484 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.484 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.484 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 
10:01:38.485 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.859 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:01:38 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1410413463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:01:38 localhost nova_compute[282206]: 2026-02-23 10:01:38.946 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.052 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.053 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:01:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon 
dump", "format": "json"} v 0) Feb 23 05:01:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.287 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.289 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11283MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.289 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.289 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:01:39 localhost podman[242954]: time="2026-02-23T10:01:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:01:39 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:39.394 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, 
record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:01:39 localhost podman[242954]: @ - - [23/Feb/2026:10:01:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:01:39 localhost podman[242954]: @ - - [23/Feb/2026:10:01:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18827 "" "Go-http-client/1.1" Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.441 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.442 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.443 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.539 282211 DEBUG oslo_concurrency.processutils 
[None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:01:39 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:39.677 2 INFO neutron.agent.securitygroups_rpc [None req-0ebe300c-0924-4a3a-86fe-0f106df51381 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:01:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2001042167' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.978 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:01:39 localhost nova_compute[282206]: 2026-02-23 10:01:39.985 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:01:40 localhost nova_compute[282206]: 2026-02-23 10:01:40.013 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 
'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:01:40 localhost nova_compute[282206]: 2026-02-23 10:01:40.017 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:01:40 localhost nova_compute[282206]: 2026-02-23 10:01:40.018 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:01:40 localhost ovn_controller[157695]: 2026-02-23T10:01:40Z|00324|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 23 05:01:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:01:41 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:01:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:43 localhost openstack_network_exporter[245358]: ERROR 10:01:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:01:43 localhost 
openstack_network_exporter[245358]: Feb 23 05:01:43 localhost openstack_network_exporter[245358]: ERROR 10:01:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:01:43 localhost openstack_network_exporter[245358]: Feb 23 05:01:43 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:43.815 2 INFO neutron.agent.securitygroups_rpc [None req-72581ce1-6ecd-4794-bdfa-9dea488ea111 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['48663913-ae52-424c-8374-b7539096caba']#033[00m Feb 23 05:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:01:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:01:43 localhost nova_compute[282206]: 2026-02-23 10:01:43.862 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:01:43 localhost systemd[1]: tmp-crun.IRThQN.mount: Deactivated successfully. 
Feb 23 05:01:43 localhost podman[319736]: 2026-02-23 10:01:43.941709558 +0000 UTC m=+0.105680160 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:01:43 localhost podman[319736]: 2026-02-23 10:01:43.9545088 +0000 UTC m=+0.118479392 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:01:43 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:01:44 localhost podman[319735]: 2026-02-23 10:01:44.037443225 +0000 UTC m=+0.204189924 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller) Feb 23 05:01:44 localhost podman[319735]: 2026-02-23 10:01:44.108567098 +0000 UTC m=+0.275313747 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true) Feb 23 05:01:44 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:01:44 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 23 05:01:45 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:45.596 2 INFO neutron.agent.securitygroups_rpc [None req-97c39ba0-e75b-404a-baf7-3d9225783656 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['48663913-ae52-424c-8374-b7539096caba', '3384fe18-0fab-4c8a-9159-5a07fe1d4f48']#033[00m Feb 23 05:01:45 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:45.971 2 INFO neutron.agent.securitygroups_rpc [None req-62619d1c-ae99-4a07-8196-e49ad1562f12 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['3384fe18-0fab-4c8a-9159-5a07fe1d4f48']#033[00m Feb 23 05:01:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:47 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-557795333", "format": "json"} : dispatch Feb 23 05:01:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:01:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:01:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:01:48 localhost nova_compute[282206]: 2026-02-23 10:01:48.013 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:48 localhost nova_compute[282206]: 2026-02-23 10:01:48.013 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:01:48 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:48.558 2 INFO neutron.agent.securitygroups_rpc [None req-deb2d108-d73d-4f3b-b409-b0d5aa38dbb2 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['7ccfcbe5-3a12-4044-a554-c033a2966e5e']#033[00m Feb 23 05:01:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:01:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:01:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:01:48.562 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:01:48 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:01:48 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:01:48 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-557795333", "caps": ["mds", "allow rw path=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24", "osd", "allow rw 
pool=manila_data namespace=fsvolumens_c3aedd71-b342-4920-afd2-d5c6fd4776d2", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:01:48 localhost nova_compute[282206]: 2026-02-23 10:01:48.865 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:01:49 localhost podman[319783]: 2026-02-23 10:01:49.909952766 +0000 UTC m=+0.084144494 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 05:01:49 localhost podman[319783]: 2026-02-23 10:01:49.923290375 +0000 UTC m=+0.097482103 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', 
'/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0) Feb 23 05:01:49 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:01:50 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:50.145 2 INFO neutron.agent.securitygroups_rpc [None req-80e2a041-7b6e-4f6c-b102-2630aa52a7b1 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['fdc27d14-90ab-419e-a68e-458d1f31be69', 'abc26baf-67f8-4703-8e61-63db3bbb7b3b', '7ccfcbe5-3a12-4044-a554-c033a2966e5e']#033[00m Feb 23 05:01:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:01:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:01:50 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:50.833 2 INFO neutron.agent.securitygroups_rpc [None req-5d850956-08e8-4589-ad52-05619933d70c 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['abc26baf-67f8-4703-8e61-63db3bbb7b3b', 'fdc27d14-90ab-419e-a68e-458d1f31be69']#033[00m Feb 23 05:01:51 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict 
{filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24],prefix=session evict} (starting...) Feb 23 05:01:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:01:51 localhost podman[319802]: 2026-02-23 10:01:51.910100231 +0000 UTC m=+0.085558548 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 05:01:51 localhost podman[319802]: 2026-02-23 10:01:51.918278338 +0000 UTC m=+0.093737135 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:01:51 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:01:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:53 localhost neutron_sriov_agent[258207]: 2026-02-23 10:01:53.182 2 INFO neutron.agent.securitygroups_rpc [None req-6b3edf75-e2b6-4f03-a982-c5a8a9628048 4ce76e2a79a849e8a6b3b31c05f9bc96 90343b3c0ce240adab2c21e5c92b6952 - - default default] Security group member updated ['9a671aa5-4d76-4c1e-8de2-506f29ad907b']#033[00m Feb 23 05:01:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:01:53 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:01:53 localhost nova_compute[282206]: 2026-02-23 10:01:53.869 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:01:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} v 0) Feb 23 05:01:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} : dispatch Feb 23 05:01:54 
localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"}]': finished Feb 23 05:01:54 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-557795333", "format": "json"} : dispatch Feb 23 05:01:54 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} : dispatch Feb 23 05:01:54 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"} : dispatch Feb 23 05:01:54 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-557795333,client_metadata.root=/volumes/_nogroup/c3aedd71-b342-4920-afd2-d5c6fd4776d2/28f0b5f4-80b5-453b-a6b1-657204142a24],prefix=session evict} (starting...) 
Feb 23 05:01:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e207 do_prune osdmap full prune enabled Feb 23 05:01:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-557795333"}]': finished Feb 23 05:01:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e208 e208: 6 total, 6 up, 6 in Feb 23 05:01:55 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in Feb 23 05:01:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:01:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0) Feb 23 05:01:58 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 23 05:01:58 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Feb 23 05:01:58 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=Joe,client_metadata.root=/volumes/_nogroup/7a63f65b-263e-4f0a-be43-9aace02f6e45/c5b2858c-2130-4d42-a52c-3132bbd1bfb7],prefix=session evict} (starting...) 
Feb 23 05:01:58 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Feb 23 05:01:58 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 23 05:01:58 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Feb 23 05:01:58 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Feb 23 05:01:58 localhost nova_compute[282206]: 2026-02-23 10:01:58.871 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:01:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:01:58 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e208 do_prune osdmap full prune enabled Feb 23 05:02:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e209 e209: 6 total, 6 up, 6 in Feb 23 05:02:00 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in Feb 23 05:02:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:02:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:01 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader).osd e209 do_prune osdmap full prune enabled Feb 23 05:02:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e210 e210: 6 total, 6 up, 6 in Feb 23 05:02:01 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in Feb 23 05:02:01 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:01.123 265541 INFO neutron.agent.linux.ip_lib [None req-52c672d4-2817-4e80-a6b6-55e7dbc6cefe - - - - - -] Device tap1dc46c99-c0 cannot be used as it has no MAC address#033[00m Feb 23 05:02:01 localhost nova_compute[282206]: 2026-02-23 10:02:01.147 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:01 localhost kernel: device tap1dc46c99-c0 entered promiscuous mode Feb 23 05:02:01 localhost NetworkManager[5974]: [1771840921.1556] manager: (tap1dc46c99-c0): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Feb 23 05:02:01 localhost nova_compute[282206]: 2026-02-23 10:02:01.159 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:01 localhost systemd-udevd[319829]: Network interface NamePolicy= disabled on kernel command line. Feb 23 05:02:01 localhost ovn_controller[157695]: 2026-02-23T10:02:01Z|00325|binding|INFO|Claiming lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 for this chassis. 
Feb 23 05:02:01 localhost ovn_controller[157695]: 2026-02-23T10:02:01Z|00326|binding|INFO|1dc46c99-c0ab-47c0-8b05-6120a8497956: Claiming unknown Feb 23 05:02:01 localhost ovn_controller[157695]: 2026-02-23T10:02:01Z|00327|binding|INFO|Setting lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 ovn-installed in OVS Feb 23 05:02:01 localhost nova_compute[282206]: 2026-02-23 10:02:01.199 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:01 localhost nova_compute[282206]: 2026-02-23 10:02:01.234 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:01 localhost nova_compute[282206]: 2026-02-23 10:02:01.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:01 localhost ovn_controller[157695]: 2026-02-23T10:02:01Z|00328|binding|INFO|Setting lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 up in Southbound Feb 23 05:02:01 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:01.281 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02994efd-16d1-4091-9839-70c330f56226', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02994efd-16d1-4091-9839-70c330f56226', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 
'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a05abb56-1acf-40c9-887f-cac11ff4663b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1dc46c99-c0ab-47c0-8b05-6120a8497956) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:01 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:01.283 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1dc46c99-c0ab-47c0-8b05-6120a8497956 in datapath 02994efd-16d1-4091-9839-70c330f56226 bound to our chassis#033[00m Feb 23 05:02:01 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:01.285 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02994efd-16d1-4091-9839-70c330f56226 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 05:02:01 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:01.288 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1664133f-1441-4b7b-9912-4146b871877a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:02 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Feb 23 05:02:02 localhost podman[319883]: Feb 23 05:02:02 localhost podman[319883]: 2026-02-23 10:02:02.12316297 +0000 UTC m=+0.103431879 container create e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:02:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:02:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:02:02 localhost systemd[1]: Started libpod-conmon-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805.scope. Feb 23 05:02:02 localhost podman[319883]: 2026-02-23 10:02:02.073155979 +0000 UTC m=+0.053424908 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:02:02 localhost systemd[1]: Started libcrun container. 
Feb 23 05:02:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b5625997ec8f95dd5a95ab1651fea2d62360cf9a611d1d9f87c7783ab5b51320/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:02:02 localhost podman[319883]: 2026-02-23 10:02:02.205554437 +0000 UTC m=+0.185823336 container init e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:02:02 localhost podman[319883]: 2026-02-23 10:02:02.216209502 +0000 UTC m=+0.196478401 container start e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0) Feb 23 05:02:02 localhost dnsmasq[319923]: started, version 2.85 cachesize 150 Feb 23 05:02:02 localhost dnsmasq[319923]: DNS service limited to local subnets Feb 23 05:02:02 localhost dnsmasq[319923]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:02:02 localhost dnsmasq[319923]: warning: no upstream servers 
configured Feb 23 05:02:02 localhost dnsmasq-dhcp[319923]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d Feb 23 05:02:02 localhost dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 0 addresses Feb 23 05:02:02 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host Feb 23 05:02:02 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts Feb 23 05:02:02 localhost podman[319897]: 2026-02-23 10:02:02.268263137 +0000 UTC m=+0.100011372 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., managed_by=edpm_ansible) Feb 23 05:02:02 localhost podman[319897]: 2026-02-23 10:02:02.277719734 +0000 UTC m=+0.109467919 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9) Feb 23 05:02:02 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:02:02 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:02.339 265541 INFO neutron.agent.dhcp.agent [None req-6b4183d0-b8bf-44ce-82c6-0e22de4cb49a - - - - - -] DHCP configuration for ports {'12437aba-a85e-444a-8c2a-043bc899c9b2'} is completed#033[00m Feb 23 05:02:02 localhost podman[319898]: 2026-02-23 10:02:02.370504628 +0000 UTC m=+0.197669099 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:02:02 localhost podman[319898]: 2026-02-23 10:02:02.41131933 +0000 UTC m=+0.238483811 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': 
'/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 05:02:02 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 05:02:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:02:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e210 do_prune osdmap full prune enabled
Feb 23 05:02:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e211 e211: 6 total, 6 up, 6 in
Feb 23 05:02:02 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in
Feb 23 05:02:02 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:02.646 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:02Z, description=, device_id=d11c654a-ee3c-4ca8-93a5-f268ff8b2e3b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=11716704-7d2e-4ac8-ab68-6757822cada4, ip_allocation=immediate, mac_address=fa:16:3e:fc:79:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:01:58Z, description=, dns_domain=, id=02994efd-16d1-4091-9839-70c330f56226, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1823901390, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41387, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2973, status=ACTIVE, subnets=['cdc6b546-53fa-420b-b344-ddddb12b955f'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:00Z, vlan_transparent=None, network_id=02994efd-16d1-4091-9839-70c330f56226, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2982, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:02Z on network 02994efd-16d1-4091-9839-70c330f56226#033[00m
Feb 23 05:02:02 localhost dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 1 addresses
Feb 23 05:02:02 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 05:02:02 localhost podman[319960]: 2026-02-23 10:02:02.830673199 +0000 UTC m=+0.057964940 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0)
Feb 23 05:02:02 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 05:02:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:03.172 265541 INFO neutron.agent.dhcp.agent [None req-a6593d7d-3d1a-41bb-9271-b655eb5b020a - - - - - -] DHCP configuration for ports {'11716704-7d2e-4ac8-ab68-6757822cada4'} is completed#033[00m
Feb 23 05:02:03 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:03.640 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:02Z, description=, device_id=d11c654a-ee3c-4ca8-93a5-f268ff8b2e3b, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=11716704-7d2e-4ac8-ab68-6757822cada4, ip_allocation=immediate, mac_address=fa:16:3e:fc:79:0a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:01:58Z, description=, dns_domain=, id=02994efd-16d1-4091-9839-70c330f56226, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1823901390, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41387, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2973, status=ACTIVE, subnets=['cdc6b546-53fa-420b-b344-ddddb12b955f'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:00Z, vlan_transparent=None, network_id=02994efd-16d1-4091-9839-70c330f56226, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2982, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:02Z on network 02994efd-16d1-4091-9839-70c330f56226#033[00m
Feb 23 05:02:03 localhost dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 1 addresses
Feb 23 05:02:03 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 05:02:03 localhost podman[319998]: 2026-02-23 10:02:03.832562254 +0000 UTC m=+0.062379179 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 23 05:02:03 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 05:02:03 localhost nova_compute[282206]: 2026-02-23 10:02:03.875 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:04 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:04.113 265541 INFO neutron.agent.dhcp.agent [None req-df82d37c-32a0-42a4-83dd-76b9a0e4fa20 - - - - - -] DHCP configuration for ports {'11716704-7d2e-4ac8-ab68-6757822cada4'} is completed#033[00m
Feb 23 05:02:04 localhost dnsmasq[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/addn_hosts - 0 addresses
Feb 23 05:02:04 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/host
Feb 23 05:02:04 localhost dnsmasq-dhcp[319923]: read /var/lib/neutron/dhcp/02994efd-16d1-4091-9839-70c330f56226/opts
Feb 23 05:02:04 localhost podman[320034]: 2026-02-23 10:02:04.473003059 +0000 UTC m=+0.061841563 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 23 05:02:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:02:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:04 localhost kernel: device tap1dc46c99-c0 left promiscuous mode
Feb 23 05:02:04 localhost nova_compute[282206]: 2026-02-23 10:02:04.725 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:04 localhost ovn_controller[157695]: 2026-02-23T10:02:04Z|00329|binding|INFO|Releasing lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 from this chassis (sb_readonly=0)
Feb 23 05:02:04 localhost ovn_controller[157695]: 2026-02-23T10:02:04Z|00330|binding|INFO|Setting lport 1dc46c99-c0ab-47c0-8b05-6120a8497956 down in Southbound
Feb 23 05:02:04 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:04.740 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-02994efd-16d1-4091-9839-70c330f56226', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-02994efd-16d1-4091-9839-70c330f56226', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a05abb56-1acf-40c9-887f-cac11ff4663b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1dc46c99-c0ab-47c0-8b05-6120a8497956) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 05:02:04 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:04.743 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1dc46c99-c0ab-47c0-8b05-6120a8497956 in datapath 02994efd-16d1-4091-9839-70c330f56226 unbound from our chassis#033[00m
Feb 23 05:02:04 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:04.745 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 02994efd-16d1-4091-9839-70c330f56226 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Feb 23 05:02:04 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:04.746 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd0397e-6ec1-4464-a43e-09584b77d699]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Feb 23 05:02:04 localhost nova_compute[282206]: 2026-02-23 10:02:04.756 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:05 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 23 05:02:05 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:05 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:05 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819", "osd", "allow rw pool=manila_data namespace=fsvolumens_ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.190899) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925190935, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2560, "num_deletes": 275, "total_data_size": 3244680, "memory_usage": 3296672, "flush_reason": "Manual Compaction"}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925212796, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3180588, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30559, "largest_seqno": 33118, "table_properties": {"data_size": 3169010, "index_size": 7443, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 28004, "raw_average_key_size": 22, "raw_value_size": 3144663, "raw_average_value_size": 2579, "num_data_blocks": 311, "num_entries": 1219, "num_filter_entries": 1219, "num_deletions": 275, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840797, "oldest_key_time": 1771840797, "file_creation_time": 1771840925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 21994 microseconds, and 7458 cpu microseconds.
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.212850) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3180588 bytes OK
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.212915) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.215294) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.215315) EVENT_LOG_v1 {"time_micros": 1771840925215309, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.215339) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3233133, prev total WAL file size 3233133, number of live WAL files 2.
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.216194) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3106KB)], [57(14MB)]
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925216272, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18594771, "oldest_snapshot_seqno": -1}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13268 keys, 17389646 bytes, temperature: kUnknown
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925324551, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 17389646, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17314048, "index_size": 41321, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33221, "raw_key_size": 355930, "raw_average_key_size": 26, "raw_value_size": 17088119, "raw_average_value_size": 1287, "num_data_blocks": 1557, "num_entries": 13268, "num_filter_entries": 13268, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771840925, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.325011) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 17389646 bytes
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.328006) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.6 rd, 160.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.0, 14.7 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(11.3) write-amplify(5.5) OK, records in: 13832, records dropped: 564 output_compression: NoCompression
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.328037) EVENT_LOG_v1 {"time_micros": 1771840925328023, "job": 34, "event": "compaction_finished", "compaction_time_micros": 108357, "compaction_time_cpu_micros": 47967, "output_level": 6, "num_output_files": 1, "total_output_size": 17389646, "num_input_records": 13832, "num_output_records": 13268, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925328623, "job": 34, "event": "table_file_deletion", "file_number": 59}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771840925331478, "job": 34, "event": "table_file_deletion", "file_number": 57}
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.216085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 05:02:05 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:02:05.331749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 23 05:02:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e211 do_prune osdmap full prune enabled
Feb 23 05:02:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e212 e212: 6 total, 6 up, 6 in
Feb 23 05:02:05 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in
Feb 23 05:02:06 localhost dnsmasq[319923]: exiting on receipt of SIGTERM
Feb 23 05:02:06 localhost systemd[1]: libpod-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805.scope: Deactivated successfully.
Feb 23 05:02:06 localhost podman[320075]: 2026-02-23 10:02:06.271012835 +0000 UTC m=+0.071607339 container kill e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 23 05:02:06 localhost podman[320087]: 2026-02-23 10:02:06.347508238 +0000 UTC m=+0.065105475 container died e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 05:02:06 localhost podman[320087]: 2026-02-23 10:02:06.386061009 +0000 UTC m=+0.103658206 container cleanup e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 05:02:06 localhost systemd[1]: libpod-conmon-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805.scope: Deactivated successfully.
Feb 23 05:02:06 localhost podman[320089]: 2026-02-23 10:02:06.423710341 +0000 UTC m=+0.133568196 container remove e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-02994efd-16d1-4091-9839-70c330f56226, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 05:02:06 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:06.457 265541 INFO neutron.agent.dhcp.agent [None req-264e6601-72c4-439f-84bc-7e07da93162e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 23 05:02:06 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:06.644 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m
Feb 23 05:02:06 localhost ovn_controller[157695]: 2026-02-23T10:02:06Z|00331|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 05:02:07 localhost nova_compute[282206]: 2026-02-23 10:02:07.063 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:07 localhost systemd[1]: var-lib-containers-storage-overlay-b5625997ec8f95dd5a95ab1651fea2d62360cf9a611d1d9f87c7783ab5b51320-merged.mount: Deactivated successfully.
Feb 23 05:02:07 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4e6266ea23b53936bba2075cfc553a8c045b26e4497c28e2b060fa44c524805-userdata-shm.mount: Deactivated successfully.
Feb 23 05:02:07 localhost systemd[1]: run-netns-qdhcp\x2d02994efd\x2d16d1\x2d4091\x2d9839\x2d70c330f56226.mount: Deactivated successfully.
Feb 23 05:02:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:02:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e212 do_prune osdmap full prune enabled
Feb 23 05:02:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e213 e213: 6 total, 6 up, 6 in
Feb 23 05:02:07 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in
Feb 23 05:02:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:02:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:02:08 localhost nova_compute[282206]: 2026-02-23 10:02:08.877 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:09 localhost neutron_sriov_agent[258207]: 2026-02-23 10:02:09.223 2 INFO neutron.agent.securitygroups_rpc [None req-34a9c0cf-78ea-4a50-a570-1c89dfa87f59 ccd9ce6e3fef42b59d2107f1a22eac97 68a48b471ed84048aeb651374fff5111 - - default default] Security group member updated ['712b70a2-0074-4f4c-8d5a-c22b0f563b07']#033[00m
Feb 23 05:02:09 localhost podman[242954]: time="2026-02-23T10:02:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 05:02:09 localhost podman[242954]: @ - - [23/Feb/2026:10:02:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 05:02:09 localhost podman[242954]: @ - - [23/Feb/2026:10:02:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18825 "" "Go-http-client/1.1"
Feb 23 05:02:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e213 do_prune osdmap full prune enabled
Feb 23 05:02:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e214 e214: 6 total, 6 up, 6 in
Feb 23 05:02:10 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in
Feb 23 05:02:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e214 do_prune osdmap full prune enabled
Feb 23 05:02:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e215 e215: 6 total, 6 up, 6 in
Feb 23 05:02:11 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in
Feb 23 05:02:11 localhost neutron_sriov_agent[258207]: 2026-02-23 10:02:11.788 2 INFO neutron.agent.securitygroups_rpc [None req-90d4154c-098d-45e4-8f77-47336730be40 ccd9ce6e3fef42b59d2107f1a22eac97 68a48b471ed84048aeb651374fff5111 - - default default] Security group member updated ['712b70a2-0074-4f4c-8d5a-c22b0f563b07']#033[00m
Feb 23 05:02:12 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 23 05:02:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:02:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e215 do_prune osdmap full prune enabled
Feb 23 05:02:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e216 e216: 6 total, 6 up, 6 in
Feb 23 05:02:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in
Feb 23 05:02:13 localhost openstack_network_exporter[245358]: ERROR 10:02:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 05:02:13 localhost openstack_network_exporter[245358]:
Feb 23 05:02:13 localhost openstack_network_exporter[245358]: ERROR 10:02:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 05:02:13 localhost openstack_network_exporter[245358]:
Feb 23 05:02:13 localhost nova_compute[282206]: 2026-02-23 10:02:13.880 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:02:13 localhost nova_compute[282206]: 2026-02-23 10:02:13.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:13 localhost nova_compute[282206]: 2026-02-23 10:02:13.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 05:02:13 localhost nova_compute[282206]: 2026-02-23 10:02:13.881 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:02:13 localhost nova_compute[282206]: 2026-02-23 10:02:13.882 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:02:13 localhost nova_compute[282206]: 2026-02-23 10:02:13.884 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:02:14 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=david,client_metadata.root=/volumes/_nogroup/d958ade4-5b7f-45eb-b23d-cb42046e5d2f/7b3f1ec9-37eb-4781-a710-21b4c27d3f21],prefix=session evict} (starting...)
Feb 23 05:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 05:02:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 05:02:14 localhost podman[320117]: 2026-02-23 10:02:14.919354303 +0000 UTC m=+0.087272563 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216)
Feb 23 05:02:14 localhost systemd[1]: tmp-crun.vu5Fs6.mount: Deactivated successfully.
Feb 23 05:02:14 localhost podman[320117]: 2026-02-23 10:02:14.981652609 +0000 UTC m=+0.149570849 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 05:02:14 localhost podman[320118]: 2026-02-23 10:02:14.984464908 +0000 UTC m=+0.146782542 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Feb 23 05:02:15 localhost podman[320118]: 2026-02-23 10:02:15.017463304 +0000 UTC m=+0.179780888 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service',
'--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:02:15 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 05:02:15 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:02:15 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:15.348 265541 INFO neutron.agent.linux.ip_lib [None req-391867ec-2d21-4208-aff1-162f6dbd896d - - - - - -] Device tap1634e81d-e7 cannot be used as it has no MAC address#033[00m Feb 23 05:02:15 localhost nova_compute[282206]: 2026-02-23 10:02:15.375 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:15 localhost kernel: device tap1634e81d-e7 entered promiscuous mode Feb 23 05:02:15 localhost nova_compute[282206]: 2026-02-23 10:02:15.386 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:15 localhost ovn_controller[157695]: 2026-02-23T10:02:15Z|00332|binding|INFO|Claiming lport 1634e81d-e702-4faf-9e73-2065f8f0e08b for this chassis. Feb 23 05:02:15 localhost ovn_controller[157695]: 2026-02-23T10:02:15Z|00333|binding|INFO|1634e81d-e702-4faf-9e73-2065f8f0e08b: Claiming unknown Feb 23 05:02:15 localhost NetworkManager[5974]: [1771840935.3947] manager: (tap1634e81d-e7): new Generic device (/org/freedesktop/NetworkManager/Devices/53) Feb 23 05:02:15 localhost systemd-udevd[320176]: Network interface NamePolicy= disabled on kernel command line. 
Feb 23 05:02:15 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:15.401 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f42d0d0d-6ed8-4a36-9fc6-62bf3138a3ca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1634e81d-e702-4faf-9e73-2065f8f0e08b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:15 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:15.404 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1634e81d-e702-4faf-9e73-2065f8f0e08b in datapath e63b9444-12b5-401f-bd30-af34ee321bad bound to our chassis#033[00m Feb 23 05:02:15 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:15.406 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e63b9444-12b5-401f-bd30-af34ee321bad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 05:02:15 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:15.408 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[49087957-68ee-4269-b096-6678990299d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost ovn_controller[157695]: 2026-02-23T10:02:15Z|00334|binding|INFO|Setting lport 1634e81d-e702-4faf-9e73-2065f8f0e08b ovn-installed in OVS Feb 23 05:02:15 localhost ovn_controller[157695]: 2026-02-23T10:02:15Z|00335|binding|INFO|Setting lport 1634e81d-e702-4faf-9e73-2065f8f0e08b up in Southbound Feb 23 05:02:15 localhost nova_compute[282206]: 2026-02-23 10:02:15.432 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost journal[231253]: ethtool ioctl error on tap1634e81d-e7: No such device Feb 23 05:02:15 localhost nova_compute[282206]: 2026-02-23 10:02:15.476 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:15 localhost nova_compute[282206]: 2026-02-23 10:02:15.510 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:16 localhost podman[320247]: Feb 23 05:02:16 localhost podman[320247]: 2026-02-23 10:02:16.355526786 +0000 UTC m=+0.101334403 container create 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:02:16 localhost systemd[1]: Started libpod-conmon-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591.scope. Feb 23 05:02:16 localhost podman[320247]: 2026-02-23 10:02:16.308317324 +0000 UTC m=+0.054124981 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:02:16 localhost systemd[1]: Started libcrun container. 
Feb 23 05:02:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ebec7039e60db65acdf7c0ecd16e923040d414148abcea66fbe688ace2f9805/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:02:16 localhost podman[320247]: 2026-02-23 10:02:16.435082176 +0000 UTC m=+0.180889793 container init 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 05:02:16 localhost podman[320247]: 2026-02-23 10:02:16.441008242 +0000 UTC m=+0.186815869 container start 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:02:16 localhost dnsmasq[320265]: started, version 2.85 cachesize 150 Feb 23 05:02:16 localhost dnsmasq[320265]: DNS service limited to local subnets Feb 23 05:02:16 localhost dnsmasq[320265]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:02:16 localhost dnsmasq[320265]: warning: no upstream servers 
configured Feb 23 05:02:16 localhost dnsmasq-dhcp[320265]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 05:02:16 localhost dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 0 addresses Feb 23 05:02:16 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host Feb 23 05:02:16 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts Feb 23 05:02:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:16.507 265541 INFO neutron.agent.dhcp.agent [None req-391867ec-2d21-4208-aff1-162f6dbd896d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:15Z, description=, device_id=2cd213c5-1cc0-4ece-97cb-aacda1a21f15, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a163dd8c-da34-4abd-b418-3c438cf8fe6f, ip_allocation=immediate, mac_address=fa:16:3e:b6:17:9f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:13Z, description=, dns_domain=, id=e63b9444-12b5-401f-bd30-af34ee321bad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1517713891, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3015, status=ACTIVE, subnets=['448cc415-671f-4bd9-897a-4370ca9edf8d'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:14Z, vlan_transparent=None, network_id=e63b9444-12b5-401f-bd30-af34ee321bad, port_security_enabled=False, 
project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3031, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:15Z on network e63b9444-12b5-401f-bd30-af34ee321bad#033[00m Feb 23 05:02:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:16.574 265541 INFO neutron.agent.dhcp.agent [None req-01c86bfa-5238-444b-ad0e-daa85ed342cb - - - - - -] DHCP configuration for ports {'a52ef47a-5244-4d45-bfde-ab5e56a9da0d'} is completed#033[00m Feb 23 05:02:16 localhost dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 1 addresses Feb 23 05:02:16 localhost podman[320284]: 2026-02-23 10:02:16.720102946 +0000 UTC m=+0.065821858 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Feb 23 05:02:16 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host Feb 23 05:02:16 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts Feb 23 05:02:16 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:16.938 265541 INFO neutron.agent.dhcp.agent [None req-391867ec-2d21-4208-aff1-162f6dbd896d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, 
created_at=2026-02-23T10:02:15Z, description=, device_id=2cd213c5-1cc0-4ece-97cb-aacda1a21f15, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a163dd8c-da34-4abd-b418-3c438cf8fe6f, ip_allocation=immediate, mac_address=fa:16:3e:b6:17:9f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:13Z, description=, dns_domain=, id=e63b9444-12b5-401f-bd30-af34ee321bad, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1517713891, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54215, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3015, status=ACTIVE, subnets=['448cc415-671f-4bd9-897a-4370ca9edf8d'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:14Z, vlan_transparent=None, network_id=e63b9444-12b5-401f-bd30-af34ee321bad, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3031, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:15Z on network e63b9444-12b5-401f-bd30-af34ee321bad#033[00m Feb 23 05:02:17 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:17.031 265541 INFO neutron.agent.dhcp.agent [None req-2fe1288e-0211-40c4-91a3-6b94f1898434 - - - - - -] DHCP configuration for ports {'a163dd8c-da34-4abd-b418-3c438cf8fe6f'} is completed#033[00m Feb 23 05:02:17 localhost dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 1 addresses Feb 23 05:02:17 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host Feb 23 05:02:17 
localhost podman[320323]: 2026-02-23 10:02:17.149813262 +0000 UTC m=+0.066424557 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 05:02:17 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts Feb 23 05:02:17 localhost systemd[1]: tmp-crun.GXbD42.mount: Deactivated successfully. Feb 23 05:02:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e216 do_prune osdmap full prune enabled Feb 23 05:02:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e217 e217: 6 total, 6 up, 6 in Feb 23 05:02:17 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in Feb 23 05:02:17 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:17.474 265541 INFO neutron.agent.dhcp.agent [None req-ab67c062-1b9e-4f32-8032-c93b18a893e9 - - - - - -] DHCP configuration for ports {'a163dd8c-da34-4abd-b418-3c438cf8fe6f'} is completed#033[00m Feb 23 05:02:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:02:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:17 
localhost dnsmasq[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/addn_hosts - 0 addresses Feb 23 05:02:17 localhost podman[320360]: 2026-02-23 10:02:17.726974908 +0000 UTC m=+0.070633439 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216) Feb 23 05:02:17 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/host Feb 23 05:02:17 localhost dnsmasq-dhcp[320265]: read /var/lib/neutron/dhcp/e63b9444-12b5-401f-bd30-af34ee321bad/opts Feb 23 05:02:17 localhost ovn_controller[157695]: 2026-02-23T10:02:17Z|00336|binding|INFO|Releasing lport 1634e81d-e702-4faf-9e73-2065f8f0e08b from this chassis (sb_readonly=0) Feb 23 05:02:17 localhost ovn_controller[157695]: 2026-02-23T10:02:17Z|00337|binding|INFO|Setting lport 1634e81d-e702-4faf-9e73-2065f8f0e08b down in Southbound Feb 23 05:02:17 localhost kernel: device tap1634e81d-e7 left promiscuous mode Feb 23 05:02:17 localhost nova_compute[282206]: 2026-02-23 10:02:17.945 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:17 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:17.957 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], 
up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e63b9444-12b5-401f-bd30-af34ee321bad', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f42d0d0d-6ed8-4a36-9fc6-62bf3138a3ca, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=1634e81d-e702-4faf-9e73-2065f8f0e08b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:17 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:17.959 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 1634e81d-e702-4faf-9e73-2065f8f0e08b in datapath e63b9444-12b5-401f-bd30-af34ee321bad unbound from our chassis#033[00m Feb 23 05:02:17 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:17.961 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e63b9444-12b5-401f-bd30-af34ee321bad or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 05:02:17 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:17.963 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[fc5ce108-ba8a-4843-9849-936642f38428]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:17 localhost nova_compute[282206]: 2026-02-23 10:02:17.966 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0) Feb 23 05:02:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 23 05:02:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Feb 23 05:02:18 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=david,client_metadata.root=/volumes/_nogroup/ce93cb3d-9c0e-41e4-ad7d-9451b8b5d7b4/555eb612-b64e-47a9-a736-f6e77cc4a819],prefix=session evict} (starting...) 
Feb 23 05:02:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Feb 23 05:02:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 23 05:02:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Feb 23 05:02:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Feb 23 05:02:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e217 do_prune osdmap full prune enabled Feb 23 05:02:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 e218: 6 total, 6 up, 6 in Feb 23 05:02:18 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in Feb 23 05:02:18 localhost dnsmasq[320265]: exiting on receipt of SIGTERM Feb 23 05:02:18 localhost podman[320401]: 2026-02-23 10:02:18.840593791 +0000 UTC m=+0.065822367 container kill 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:02:18 localhost systemd[1]: libpod-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591.scope: Deactivated successfully. 
Feb 23 05:02:18 localhost nova_compute[282206]: 2026-02-23 10:02:18.886 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:18 localhost podman[320416]: 2026-02-23 10:02:18.923613319 +0000 UTC m=+0.065447866 container died 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2) Feb 23 05:02:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591-userdata-shm.mount: Deactivated successfully. Feb 23 05:02:18 localhost podman[320416]: 2026-02-23 10:02:18.96377533 +0000 UTC m=+0.105609837 container cleanup 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, tcib_managed=true) Feb 23 05:02:18 localhost systemd[1]: libpod-conmon-5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591.scope: Deactivated successfully. 
Feb 23 05:02:19 localhost podman[320417]: 2026-02-23 10:02:19.018771017 +0000 UTC m=+0.153028737 container remove 5e57aeaa7d1ab326dfebffba525985903ef608e2be3eaeb26be383063f6f8591 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e63b9444-12b5-401f-bd30-af34ee321bad, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 05:02:19 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:19.045 265541 INFO neutron.agent.dhcp.agent [None req-8ad3a064-f5ac-4d06-b8f5-2b5226166d2f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:02:19 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:19.046 265541 INFO neutron.agent.dhcp.agent [None req-8ad3a064-f5ac-4d06-b8f5-2b5226166d2f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:02:19 localhost ovn_controller[157695]: 2026-02-23T10:02:19Z|00338|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:02:19 localhost nova_compute[282206]: 2026-02-23 10:02:19.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:19 localhost systemd[1]: var-lib-containers-storage-overlay-2ebec7039e60db65acdf7c0ecd16e923040d414148abcea66fbe688ace2f9805-merged.mount: Deactivated successfully. Feb 23 05:02:19 localhost systemd[1]: run-netns-qdhcp\x2de63b9444\x2d12b5\x2d401f\x2dbd30\x2daf34ee321bad.mount: Deactivated successfully. 
Feb 23 05:02:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.
Feb 23 05:02:20 localhost podman[320444]: 2026-02-23 10:02:20.931305053 +0000 UTC m=+0.103515092 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 23 05:02:20 localhost podman[320444]: 2026-02-23 10:02:20.947305425 +0000 UTC m=+0.119515474 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible)
Feb 23 05:02:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:02:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:02:20 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully.
Feb 23 05:02:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:02:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.
Feb 23 05:02:22 localhost systemd[1]: tmp-crun.3lyK1j.mount: Deactivated successfully.
Feb 23 05:02:22 localhost podman[320463]: 2026-02-23 10:02:22.902299514 +0000 UTC m=+0.082921845 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216)
Feb 23 05:02:22 localhost podman[320463]: 2026-02-23 10:02:22.912257627 +0000 UTC m=+0.092879928 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, config_id=ovn_metadata_agent)
Feb 23 05:02:22 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully.
Feb 23 05:02:23 localhost nova_compute[282206]: 2026-02-23 10:02:23.889 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.144 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.145 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.145 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 23 05:02:24 localhost nova_compute[282206]: 2026-02-23 10:02:24.145 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 23 05:02:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:02:24 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:02:25 localhost nova_compute[282206]: 2026-02-23 10:02:25.015 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 23 05:02:25 localhost nova_compute[282206]: 2026-02-23 10:02:25.032 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 23 05:02:25 localhost nova_compute[282206]: 2026-02-23 10:02:25.033 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 23 05:02:25 localhost nova_compute[282206]: 2026-02-23 10:02:25.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:02:25 localhost nova_compute[282206]: 2026-02-23 10:02:25.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 23 05:02:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:02:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:02:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:02:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e218 do_prune osdmap full prune enabled
Feb 23 05:02:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 e219: 6 total, 6 up, 6 in
Feb 23 05:02:27 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in
Feb 23 05:02:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:02:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 23 05:02:28 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:02:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 23 05:02:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 23 05:02:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:02:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:02:28 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:28 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:28 localhost nova_compute[282206]: 2026-02-23 10:02:28.892 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:29 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-380228807", "format": "json"} : dispatch
Feb 23 05:02:29 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:29 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:29 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-380228807", "caps": ["mds", "allow rw path=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15", "osd", "allow rw pool=manila_data namespace=fsvolumens_bedacb3b-517e-43b1-b025-790f9bc892fc", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 05:02:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3640977047' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 05:02:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 05:02:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3640977047' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 05:02:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} v 0)
Feb 23 05:02:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} : dispatch
Feb 23 05:02:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"}]': finished
Feb 23 05:02:29 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-380228807,client_metadata.root=/volumes/_nogroup/bedacb3b-517e-43b1-b025-790f9bc892fc/10c0e9c9-eacd-4b8a-ae8b-8483042ecb15],prefix=session evict} (starting...)
Feb 23 05:02:30 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-380228807", "format": "json"} : dispatch
Feb 23 05:02:30 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} : dispatch
Feb 23 05:02:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"} : dispatch
Feb 23 05:02:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-380228807"}]': finished
Feb 23 05:02:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 23 05:02:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:02:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 05:02:30 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2519691642' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 05:02:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 05:02:30 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2519691642' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 05:02:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:02:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.051 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.080 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.081 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 23 05:02:31 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:31.084 265541 INFO neutron.agent.linux.ip_lib [None req-8791b20c-6006-428b-aee4-272eee9e8751 - - - - - -] Device tap66dde4f3-68 cannot be used as it has no MAC address
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:31 localhost kernel: device tap66dde4f3-68 entered promiscuous mode
Feb 23 05:02:31 localhost ovn_controller[157695]: 2026-02-23T10:02:31Z|00339|binding|INFO|Claiming lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 for this chassis.
Feb 23 05:02:31 localhost ovn_controller[157695]: 2026-02-23T10:02:31Z|00340|binding|INFO|66dde4f3-68ac-48ac-be4a-678a62b364e3: Claiming unknown
Feb 23 05:02:31 localhost NetworkManager[5974]: [1771840951.1190] manager: (tap66dde4f3-68): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.120 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:31 localhost systemd-udevd[320577]: Network interface NamePolicy= disabled on kernel command line.
Feb 23 05:02:31 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:31.135 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7d058c2-7406-4067-bd55-39030093b520, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=66dde4f3-68ac-48ac-be4a-678a62b364e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 05:02:31 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:31.138 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 66dde4f3-68ac-48ac-be4a-678a62b364e3 in datapath 98deef06-e9d3-4399-8238-57fb5d318b61 bound to our chassis
Feb 23 05:02:31 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:31.139 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 98deef06-e9d3-4399-8238-57fb5d318b61 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 23 05:02:31 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:31.141 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[c2660d29-08b7-4da3-a0c0-102bc6b99e86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost ovn_controller[157695]: 2026-02-23T10:02:31Z|00341|binding|INFO|Setting lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 ovn-installed in OVS
Feb 23 05:02:31 localhost ovn_controller[157695]: 2026-02-23T10:02:31Z|00342|binding|INFO|Setting lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 up in Southbound
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.166 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost journal[231253]: ethtool ioctl error on tap66dde4f3-68: No such device
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.216 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 23 05:02:31 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/333892643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.569 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 23 05:02:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:02:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 23 05:02:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:02:31 localhost ceph-mon[294160]: from='mgr.27078 '
entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.651 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.651 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.744 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.881 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.883 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11269MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.883 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.884 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.970 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.971 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:02:31 localhost nova_compute[282206]: 2026-02-23 10:02:31.972 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:02:32 localhost nova_compute[282206]: 2026-02-23 10:02:32.030 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:02:32 localhost podman[320669]: Feb 23 05:02:32 localhost podman[320669]: 2026-02-23 10:02:32.209731581 +0000 UTC m=+0.111617507 container create fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Feb 23 05:02:32 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:02:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1163720048' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:02:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:02:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1163720048' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:02:32 localhost podman[320669]: 2026-02-23 10:02:32.153382811 +0000 UTC m=+0.055268767 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:02:32 localhost systemd[1]: Started libpod-conmon-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0.scope. Feb 23 05:02:32 localhost systemd[1]: Started libcrun container. 
Feb 23 05:02:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9690d5c0a30798f94ac43c813648f7fb8820202cf10f3cc9168198333355fba5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:02:32 localhost podman[320669]: 2026-02-23 10:02:32.292370335 +0000 UTC m=+0.194256261 container init fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 05:02:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 05:02:32 localhost podman[320669]: 2026-02-23 10:02:32.304429144 +0000 UTC m=+0.206315070 container start fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:02:32 localhost dnsmasq[320707]: started, version 2.85 cachesize 150 Feb 23 05:02:32 localhost dnsmasq[320707]: DNS service limited to local subnets Feb 23 05:02:32 localhost dnsmasq[320707]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:02:32 localhost dnsmasq[320707]: warning: no upstream servers configured Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: DHCPv6, static leases only on 2001:db8::, lease time 1d Feb 23 05:02:32 localhost dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 0 addresses Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts Feb 23 05:02:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.385 265541 INFO neutron.agent.dhcp.agent [None req-8791b20c-6006-428b-aee4-272eee9e8751 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:30Z, description=, 
device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e72eca06-06ff-46ce-af4c-6d430ebe4ef4, ip_allocation=immediate, mac_address=fa:16:3e:eb:88:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:28Z, description=, dns_domain=, id=98deef06-e9d3-4399-8238-57fb5d318b61, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1054964184, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50124, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3089, status=ACTIVE, subnets=['460c027b-0b10-4cdf-b00f-448326ec6496'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:29Z, vlan_transparent=None, network_id=98deef06-e9d3-4399-8238-57fb5d318b61, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3097, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:30Z on network 98deef06-e9d3-4399-8238-57fb5d318b61#033[00m Feb 23 05:02:32 localhost podman[320706]: 2026-02-23 10:02:32.405602292 +0000 UTC m=+0.086509668 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, version=9.7, io.buildah.version=1.33.7, 
maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64) Feb 23 05:02:32 localhost podman[320706]: 2026-02-23 10:02:32.420292063 +0000 UTC m=+0.101199439 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-type=git, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, container_name=openstack_network_exporter, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container) Feb 23 05:02:32 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:02:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:02:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:02:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/933428432' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:02:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.506 265541 INFO neutron.agent.dhcp.agent [None req-abaa4419-7d88-4741-af99-f9e9c1091cc0 - - - - - -] DHCP configuration for ports {'3e01d07f-b4bf-4688-8209-6f5e13561d8f'} is completed#033[00m Feb 23 05:02:32 localhost nova_compute[282206]: 2026-02-23 10:02:32.524 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:02:32 localhost nova_compute[282206]: 2026-02-23 10:02:32.532 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:02:32 localhost nova_compute[282206]: 2026-02-23 10:02:32.554 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:02:32 localhost nova_compute[282206]: 2026-02-23 10:02:32.559 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service 
record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:02:32 localhost nova_compute[282206]: 2026-02-23 10:02:32.559 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:02:32 localhost dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 1 addresses Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts Feb 23 05:02:32 localhost podman[320754]: 2026-02-23 10:02:32.585283945 +0000 UTC m=+0.052860081 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:02:32 localhost podman[320738]: 2026-02-23 10:02:32.570756559 +0000 UTC m=+0.113086073 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, 
config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 05:02:32 localhost podman[320738]: 2026-02-23 10:02:32.654328013 +0000 UTC m=+0.196657487 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 05:02:32 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 05:02:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.753 265541 INFO neutron.agent.dhcp.agent [None req-8791b20c-6006-428b-aee4-272eee9e8751 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:30Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e72eca06-06ff-46ce-af4c-6d430ebe4ef4, ip_allocation=immediate, mac_address=fa:16:3e:eb:88:78, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:28Z, description=, dns_domain=, id=98deef06-e9d3-4399-8238-57fb5d318b61, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1054964184, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50124, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3089, status=ACTIVE, subnets=['460c027b-0b10-4cdf-b00f-448326ec6496'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:29Z, vlan_transparent=None, network_id=98deef06-e9d3-4399-8238-57fb5d318b61, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3097, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:30Z on network 98deef06-e9d3-4399-8238-57fb5d318b61#033[00m Feb 23 05:02:32 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:32.870 265541 INFO neutron.agent.dhcp.agent [None req-a7d140a0-b06c-4530-b6e5-24e13ca5e839 - - 
- - - -] DHCP configuration for ports {'e72eca06-06ff-46ce-af4c-6d430ebe4ef4'} is completed#033[00m Feb 23 05:02:32 localhost dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 1 addresses Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host Feb 23 05:02:32 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts Feb 23 05:02:32 localhost podman[320806]: 2026-02-23 10:02:32.961073647 +0000 UTC m=+0.064920500 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:02:33 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:33.236 265541 INFO neutron.agent.dhcp.agent [None req-5f53b066-1c39-448a-83f2-27436a925264 - - - - - -] DHCP configuration for ports {'e72eca06-06ff-46ce-af4c-6d430ebe4ef4'} is completed#033[00m Feb 23 05:02:33 localhost nova_compute[282206]: 2026-02-23 10:02:33.895 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0) Feb 23 05:02:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 23 05:02:34 localhost ceph-mon[294160]: 
log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Feb 23 05:02:34 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=eve48,client_metadata.root=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a],prefix=session evict} (starting...) Feb 23 05:02:34 localhost nova_compute[282206]: 2026-02-23 10:02:34.758 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Feb 23 05:02:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 23 05:02:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Feb 23 05:02:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Feb 23 05:02:36 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:36.324 265541 INFO neutron.agent.linux.ip_lib [None req-327dff01-1ecd-4aea-a992-7bfef9900a0e - - - - - -] Device tap6ea52883-11 cannot be used as it has no MAC address#033[00m Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.357 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:36 localhost kernel: device tap6ea52883-11 entered promiscuous mode Feb 23 05:02:36 localhost NetworkManager[5974]: [1771840956.3640] manager: (tap6ea52883-11): new Generic device 
(/org/freedesktop/NetworkManager/Devices/55) Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.365 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:36 localhost ovn_controller[157695]: 2026-02-23T10:02:36Z|00343|binding|INFO|Claiming lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf for this chassis. Feb 23 05:02:36 localhost ovn_controller[157695]: 2026-02-23T10:02:36Z|00344|binding|INFO|6ea52883-1131-4fad-9e9e-b739d54ff0bf: Claiming unknown Feb 23 05:02:36 localhost systemd-udevd[320836]: Network interface NamePolicy= disabled on kernel command line. Feb 23 05:02:36 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:36.376 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ec4f16f-35f7-432c-b2c5-c12ffa9973a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6ea52883-1131-4fad-9e9e-b739d54ff0bf) 
old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:36 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:36.378 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6ea52883-1131-4fad-9e9e-b739d54ff0bf in datapath d54c9f86-28a1-4f1b-8617-dc63ba0e4fee bound to our chassis#033[00m Feb 23 05:02:36 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:36.380 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 05:02:36 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:36.381 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[5498c26e-6729-41d5-a965-1167241240fa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost ovn_controller[157695]: 2026-02-23T10:02:36Z|00345|binding|INFO|Setting lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf ovn-installed in OVS Feb 23 05:02:36 localhost ovn_controller[157695]: 2026-02-23T10:02:36Z|00346|binding|INFO|Setting lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf up in Southbound Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on 
tap6ea52883-11: No such device Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost journal[231253]: ethtool ioctl error on tap6ea52883-11: No such device Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.454 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.487 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.562 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:02:36 localhost nova_compute[282206]: 2026-02-23 10:02:36.562 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:02:37 localhost podman[320907]: Feb 23 05:02:37 localhost podman[320907]: 2026-02-23 10:02:37.400361415 +0000 UTC m=+0.093296641 container create dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, 
io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 05:02:37 localhost systemd[1]: Started libpod-conmon-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b.scope. Feb 23 05:02:37 localhost podman[320907]: 2026-02-23 10:02:37.354495374 +0000 UTC m=+0.047430630 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:02:37 localhost systemd[1]: Started libcrun container. Feb 23 05:02:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7bb5f63af1888222c2e9d63660f7e53fc4d9a2bdaccc92e1e2b415de8d1e37c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:02:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:37 localhost podman[320907]: 2026-02-23 10:02:37.475267387 +0000 UTC m=+0.168202613 container init dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 05:02:37 localhost podman[320907]: 2026-02-23 10:02:37.491317061 +0000 UTC m=+0.184252297 container start dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes 
Operator team, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Feb 23 05:02:37 localhost dnsmasq[320925]: started, version 2.85 cachesize 150 Feb 23 05:02:37 localhost dnsmasq[320925]: DNS service limited to local subnets Feb 23 05:02:37 localhost dnsmasq[320925]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:02:37 localhost dnsmasq[320925]: warning: no upstream servers configured Feb 23 05:02:37 localhost dnsmasq-dhcp[320925]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d Feb 23 05:02:37 localhost dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 0 addresses Feb 23 05:02:37 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host Feb 23 05:02:37 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts Feb 23 05:02:37 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:37.557 265541 INFO neutron.agent.dhcp.agent [None req-327dff01-1ecd-4aea-a992-7bfef9900a0e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:36Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=799df518-da07-48da-9a80-2682e9892d43, ip_allocation=immediate, mac_address=fa:16:3e:1e:ee:b3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:34Z, description=, dns_domain=, 
id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1408104542, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3119, status=ACTIVE, subnets=['43370c10-c3d6-4ec9-9149-8ef541cda21c'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:35Z, vlan_transparent=None, network_id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3134, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:36Z on network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee#033[00m Feb 23 05:02:37 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:37.720 265541 INFO neutron.agent.dhcp.agent [None req-4e58f5dc-c733-4bb1-8f48-2ecc514dc88b - - - - - -] DHCP configuration for ports {'97001ae9-cd83-49b2-b11f-40e665a87f8f'} is completed#033[00m Feb 23 05:02:37 localhost dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 1 addresses Feb 23 05:02:37 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host Feb 23 05:02:37 localhost podman[320944]: 2026-02-23 10:02:37.750125299 +0000 UTC m=+0.061109779 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.build-date=20260216, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:02:37 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts Feb 23 05:02:37 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:37.807 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:37 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:37.808 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:02:37 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:37.810 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:02:37 localhost nova_compute[282206]: 2026-02-23 10:02:37.846 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:38 localhost nova_compute[282206]: 2026-02-23 
10:02:38.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:02:38 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:38.069 265541 INFO neutron.agent.dhcp.agent [None req-ff4c2e13-8745-4e56-b17d-5ec0d170c869 - - - - - -] DHCP configuration for ports {'799df518-da07-48da-9a80-2682e9892d43'} is completed#033[00m Feb 23 05:02:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:02:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:02:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:02:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' 
cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Feb 23 05:02:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:02:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:02:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a", "osd", "allow rw pool=manila_data namespace=fsvolumens_ff275bf0-7ab0-4200-9dfe-d972931f7856", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:02:38 localhost systemd[1]: tmp-crun.mWt77a.mount: Deactivated successfully. 
Feb 23 05:02:38 localhost nova_compute[282206]: 2026-02-23 10:02:38.934 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:39 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:39.108 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:36Z, description=, device_id=2025587f-5897-4d24-b841-a40c65f335b2, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=799df518-da07-48da-9a80-2682e9892d43, ip_allocation=immediate, mac_address=fa:16:3e:1e:ee:b3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:34Z, description=, dns_domain=, id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1408104542, port_security_enabled=True, project_id=68a48b471ed84048aeb651374fff5111, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=189, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3119, status=ACTIVE, subnets=['43370c10-c3d6-4ec9-9149-8ef541cda21c'], tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:35Z, vlan_transparent=None, network_id=d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, port_security_enabled=False, project_id=68a48b471ed84048aeb651374fff5111, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3134, status=DOWN, tags=[], tenant_id=68a48b471ed84048aeb651374fff5111, updated_at=2026-02-23T10:02:36Z on network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee#033[00m Feb 23 
05:02:39 localhost dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 1 addresses Feb 23 05:02:39 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host Feb 23 05:02:39 localhost podman[320983]: 2026-02-23 10:02:39.305286472 +0000 UTC m=+0.046207783 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:02:39 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts Feb 23 05:02:39 localhost systemd[1]: tmp-crun.g8vMed.mount: Deactivated successfully. 
Feb 23 05:02:39 localhost podman[242954]: time="2026-02-23T10:02:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:02:39 localhost podman[242954]: @ - - [23/Feb/2026:10:02:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160717 "" "Go-http-client/1.1" Feb 23 05:02:39 localhost podman[242954]: @ - - [23/Feb/2026:10:02:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19781 "" "Go-http-client/1.1" Feb 23 05:02:39 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:39.553 265541 INFO neutron.agent.dhcp.agent [None req-094e1568-2d91-4ecc-8884-503d20b890be - - - - - -] DHCP configuration for ports {'799df518-da07-48da-9a80-2682e9892d43'} is completed#033[00m Feb 23 05:02:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0) Feb 23 05:02:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 23 05:02:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Feb 23 05:02:41 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=eve47,client_metadata.root=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a],prefix=session evict} (starting...) 
Feb 23 05:02:42 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Feb 23 05:02:42 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 23 05:02:42 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Feb 23 05:02:42 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Feb 23 05:02:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:42 localhost nova_compute[282206]: 2026-02-23 10:02:42.896 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:43 localhost nova_compute[282206]: 2026-02-23 10:02:43.129 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:43 localhost openstack_network_exporter[245358]: ERROR 10:02:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:02:43 localhost openstack_network_exporter[245358]: Feb 23 05:02:43 localhost openstack_network_exporter[245358]: ERROR 10:02:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:02:43 localhost openstack_network_exporter[245358]: Feb 23 05:02:43 localhost nova_compute[282206]: 2026-02-23 10:02:43.970 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:45 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:02:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:02:45 localhost podman[321004]: 2026-02-23 10:02:45.909008777 +0000 UTC m=+0.081897383 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 05:02:45 localhost podman[321004]: 2026-02-23 10:02:45.99318957 +0000 UTC m=+0.166078136 container exec_died 
83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Feb 23 05:02:46 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:02:46 localhost podman[321005]: 2026-02-23 10:02:45.997492525 +0000 UTC m=+0.164267149 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 05:02:46 localhost podman[321005]: 2026-02-23 10:02:46.080374438 +0000 UTC m=+0.247148992 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Feb 23 05:02:46 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:02:46 localhost nova_compute[282206]: 2026-02-23 10:02:46.120 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:46 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0) Feb 23 05:02:46 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 23 05:02:46 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Feb 23 05:02:46 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=eve49,client_metadata.root=/volumes/_nogroup/ff275bf0-7ab0-4200-9dfe-d972931f7856/b70c063c-69eb-4a53-8ff2-f992c2e4cf5a],prefix=session evict} (starting...) 
Feb 23 05:02:47 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Feb 23 05:02:47 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 23 05:02:47 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Feb 23 05:02:47 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Feb 23 05:02:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:47 localhost ovn_controller[157695]: 2026-02-23T10:02:47Z|00347|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:02:47 localhost nova_compute[282206]: 2026-02-23 10:02:47.696 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:02:48 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e219 do_prune osdmap full prune enabled Feb 23 05:02:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e220 e220: 6 total, 6 up, 6 in Feb 23 05:02:48 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in Feb 23 05:02:48 localhost 
ovn_metadata_agent[163567]: 2026-02-23 10:02:48.561 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:02:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:48.562 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:02:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:48.562 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:02:48 localhost dnsmasq[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/addn_hosts - 0 addresses Feb 23 05:02:48 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/host Feb 23 05:02:48 localhost systemd[1]: tmp-crun.bYJ5YE.mount: Deactivated successfully. 
Feb 23 05:02:48 localhost dnsmasq-dhcp[320925]: read /var/lib/neutron/dhcp/d54c9f86-28a1-4f1b-8617-dc63ba0e4fee/opts Feb 23 05:02:48 localhost podman[321070]: 2026-02-23 10:02:48.752692315 +0000 UTC m=+0.073328593 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:02:48 localhost ovn_controller[157695]: 2026-02-23T10:02:48Z|00348|binding|INFO|Releasing lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf from this chassis (sb_readonly=0) Feb 23 05:02:48 localhost nova_compute[282206]: 2026-02-23 10:02:48.935 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:48 localhost kernel: device tap6ea52883-11 left promiscuous mode Feb 23 05:02:48 localhost ovn_controller[157695]: 2026-02-23T10:02:48Z|00349|binding|INFO|Setting lport 6ea52883-1131-4fad-9e9e-b739d54ff0bf down in Southbound Feb 23 05:02:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:48.948 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 
'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ec4f16f-35f7-432c-b2c5-c12ffa9973a2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6ea52883-1131-4fad-9e9e-b739d54ff0bf) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:48.950 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 6ea52883-1131-4fad-9e9e-b739d54ff0bf in datapath d54c9f86-28a1-4f1b-8617-dc63ba0e4fee unbound from our chassis#033[00m Feb 23 05:02:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:48.952 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d54c9f86-28a1-4f1b-8617-dc63ba0e4fee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 05:02:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:48.953 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[642d2195-6ec0-4b74-bfd4-9d3d02c05fdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:48 localhost nova_compute[282206]: 2026-02-23 10:02:48.956 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:48 localhost nova_compute[282206]: 2026-02-23 10:02:48.957 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:48 localhost nova_compute[282206]: 2026-02-23 10:02:48.972 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:49 localhost dnsmasq[320925]: exiting on receipt of SIGTERM Feb 23 05:02:49 localhost podman[321111]: 2026-02-23 10:02:49.636651026 +0000 UTC m=+0.059323894 container kill dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0) Feb 23 05:02:49 localhost systemd[1]: libpod-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b.scope: Deactivated successfully. 
Feb 23 05:02:49 localhost podman[321123]: 2026-02-23 10:02:49.696580129 +0000 UTC m=+0.049863887 container died dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Feb 23 05:02:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b-userdata-shm.mount: Deactivated successfully. Feb 23 05:02:49 localhost systemd[1]: var-lib-containers-storage-overlay-a7bb5f63af1888222c2e9d63660f7e53fc4d9a2bdaccc92e1e2b415de8d1e37c-merged.mount: Deactivated successfully. Feb 23 05:02:49 localhost podman[321123]: 2026-02-23 10:02:49.780921588 +0000 UTC m=+0.134205306 container cleanup dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0) Feb 23 05:02:49 localhost systemd[1]: libpod-conmon-dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b.scope: Deactivated successfully. 
Feb 23 05:02:49 localhost podman[321130]: 2026-02-23 10:02:49.807789552 +0000 UTC m=+0.151102607 container remove dfad4dd845bc7ed39e88d033b62f05ceb63aa1d449b0ae6927f1ad41ecc2da1b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d54c9f86-28a1-4f1b-8617-dc63ba0e4fee, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0) Feb 23 05:02:49 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:49.831 265541 INFO neutron.agent.dhcp.agent [None req-7423510d-ec8a-4e02-ac6f-9b7ca928647d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:02:49 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:49.890 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:02:50 localhost ovn_controller[157695]: 2026-02-23T10:02:50Z|00350|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:02:50 localhost nova_compute[282206]: 2026-02-23 10:02:50.221 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:02:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:50 localhost systemd[1]: run-netns-qdhcp\x2dd54c9f86\x2d28a1\x2d4f1b\x2d8617\x2ddc63ba0e4fee.mount: Deactivated successfully. 
Feb 23 05:02:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:02:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:02:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:02:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:02:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:02:51 localhost podman[321153]: 2026-02-23 10:02:51.922768695 +0000 UTC m=+0.095993327 container health_status 
be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:02:51 localhost podman[321153]: 2026-02-23 10:02:51.961786269 +0000 UTC m=+0.135010861 
container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216) Feb 23 05:02:51 localhost systemd[1]: 
be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:02:52 localhost ovn_controller[157695]: 2026-02-23T10:02:52Z|00351|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:02:52 localhost nova_compute[282206]: 2026-02-23 10:02:52.072 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:52 localhost podman[321190]: 2026-02-23 10:02:52.638811952 +0000 UTC m=+0.061462891 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 05:02:52 localhost dnsmasq[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/addn_hosts - 0 addresses Feb 23 05:02:52 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/host Feb 23 05:02:52 localhost dnsmasq-dhcp[320707]: read /var/lib/neutron/dhcp/98deef06-e9d3-4399-8238-57fb5d318b61/opts Feb 23 05:02:52 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:02:52 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' 
entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:02:52 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:02:52 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:02:52 localhost ovn_controller[157695]: 2026-02-23T10:02:52Z|00352|binding|INFO|Releasing lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 from this chassis (sb_readonly=0) Feb 23 05:02:52 localhost nova_compute[282206]: 2026-02-23 10:02:52.807 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:52 localhost kernel: device tap66dde4f3-68 left promiscuous mode Feb 23 05:02:52 localhost ovn_controller[157695]: 2026-02-23T10:02:52Z|00353|binding|INFO|Setting lport 66dde4f3-68ac-48ac-be4a-678a62b364e3 down in Southbound Feb 23 05:02:52 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:52.820 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-98deef06-e9d3-4399-8238-57fb5d318b61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68a48b471ed84048aeb651374fff5111', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7d058c2-7406-4067-bd55-39030093b520, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=66dde4f3-68ac-48ac-be4a-678a62b364e3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:52 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:52.822 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 66dde4f3-68ac-48ac-be4a-678a62b364e3 in datapath 98deef06-e9d3-4399-8238-57fb5d318b61 unbound from our chassis#033[00m Feb 23 05:02:52 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:52.823 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 98deef06-e9d3-4399-8238-57fb5d318b61 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 
05:02:52 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:52.824 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[f2a5ed11-65f3-425c-a855-85a93d657b86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:52 localhost nova_compute[282206]: 2026-02-23 10:02:52.828 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:53 localhost podman[321229]: 2026-02-23 10:02:53.096390862 +0000 UTC m=+0.059649824 container kill fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Feb 23 05:02:53 localhost dnsmasq[320707]: exiting on receipt of SIGTERM Feb 23 05:02:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:02:53 localhost systemd[1]: libpod-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0.scope: Deactivated successfully. 
Feb 23 05:02:53 localhost podman[321245]: 2026-02-23 10:02:53.175078534 +0000 UTC m=+0.058796607 container died fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 05:02:53 localhost podman[321245]: 2026-02-23 10:02:53.25709587 +0000 UTC m=+0.140813943 container cleanup fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Feb 23 05:02:53 localhost systemd[1]: libpod-conmon-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0.scope: Deactivated successfully. 
Feb 23 05:02:53 localhost podman[321244]: 2026-02-23 10:02:53.279531315 +0000 UTC m=+0.162106013 container remove fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-98deef06-e9d3-4399-8238-57fb5d318b61, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2) Feb 23 05:02:53 localhost podman[321256]: 2026-02-23 10:02:53.233268691 +0000 UTC m=+0.103128749 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true) Feb 23 05:02:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:53.301 265541 INFO neutron.agent.dhcp.agent [None req-1ee41c73-7427-4721-afaa-d17654462885 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:02:53 localhost podman[321256]: 2026-02-23 10:02:53.315224456 +0000 UTC m=+0.185084504 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:02:53 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:53.316 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:02:53 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. 
Feb 23 05:02:53 localhost ovn_controller[157695]: 2026-02-23T10:02:53Z|00354|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:02:53 localhost nova_compute[282206]: 2026-02-23 10:02:53.570 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e220 do_prune osdmap full prune enabled Feb 23 05:02:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e221 e221: 6 total, 6 up, 6 in Feb 23 05:02:53 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in Feb 23 05:02:53 localhost nova_compute[282206]: 2026-02-23 10:02:53.975 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:54 localhost systemd[1]: var-lib-containers-storage-overlay-9690d5c0a30798f94ac43c813648f7fb8820202cf10f3cc9168198333355fba5-merged.mount: Deactivated successfully. Feb 23 05:02:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc50d34ac72d7e730c365dd0ac6cb0b580f34277c1783bdebb358ec1942d59e0-userdata-shm.mount: Deactivated successfully. Feb 23 05:02:54 localhost systemd[1]: run-netns-qdhcp\x2d98deef06\x2de9d3\x2d4399\x2d8238\x2d57fb5d318b61.mount: Deactivated successfully. 
Feb 23 05:02:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:02:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:02:55 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:02:55 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:02:55 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:55.327 265541 INFO neutron.agent.linux.ip_lib [None req-b28e6ae9-386e-4e32-8d1d-2e2e7f5f33ab - - - - - -] Device tap76d8ca35-ba cannot be used as it has no MAC address#033[00m Feb 23 05:02:55 localhost nova_compute[282206]: 2026-02-23 10:02:55.400 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:55 localhost kernel: device tap76d8ca35-ba entered promiscuous mode Feb 23 05:02:55 localhost NetworkManager[5974]: [1771840975.4076] manager: (tap76d8ca35-ba): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Feb 23 05:02:55 localhost nova_compute[282206]: 2026-02-23 10:02:55.411 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:55 localhost ovn_controller[157695]: 2026-02-23T10:02:55Z|00355|binding|INFO|Claiming lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d for this chassis. 
Feb 23 05:02:55 localhost ovn_controller[157695]: 2026-02-23T10:02:55Z|00356|binding|INFO|76d8ca35-bae5-4d5e-827a-3f91e4067a0d: Claiming unknown Feb 23 05:02:55 localhost systemd-udevd[321301]: Network interface NamePolicy= disabled on kernel command line. Feb 23 05:02:55 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:55.420 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a24474b213514491beaa97b54bfd695f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e0ff5c-1777-4eef-b7ef-c7cac136f27e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=76d8ca35-bae5-4d5e-827a-3f91e4067a0d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:02:55 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:55.422 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 76d8ca35-bae5-4d5e-827a-3f91e4067a0d in datapath 255d578f-65f8-4643-b21d-1ec8d68e886d bound to our chassis#033[00m Feb 23 05:02:55 localhost ovn_metadata_agent[163567]: 
2026-02-23 10:02:55.424 163572 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 255d578f-65f8-4643-b21d-1ec8d68e886d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Feb 23 05:02:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:02:55 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:02:55 localhost ovn_metadata_agent[163567]: 2026-02-23 10:02:55.425 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b58cd471-011e-478d-9fe7-8c391fef9a92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:02:55 localhost ovn_controller[157695]: 2026-02-23T10:02:55Z|00357|binding|INFO|Setting lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d ovn-installed in OVS Feb 23 05:02:55 localhost ovn_controller[157695]: 2026-02-23T10:02:55Z|00358|binding|INFO|Setting lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d up in Southbound Feb 23 05:02:55 localhost nova_compute[282206]: 2026-02-23 10:02:55.457 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:55 localhost nova_compute[282206]: 2026-02-23 10:02:55.497 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:55 localhost nova_compute[282206]: 2026-02-23 10:02:55.525 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' 
entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:02:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:02:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:02:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.145 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 
05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.177 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cdf7cb8-a976-4e03-94f5-494e706a1b4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.148214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3571dd4-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'a0cd51b5e4954fa65f7d95ee08514dbbc9b78f09af51e68e7f694de73806d46c'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.148214', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3573ba2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '6acc43618268c804f827c5fa4826f895ed11d32d8b5fd1716146a78541120942'}]}, 'timestamp': '2026-02-23 10:02:56.178529', '_unique_id': 'c9cfea9bff464815ae1e553d92844893'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.180 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.188 12 DEBUG ceilometer.compute.pollsters [-] 
c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43cbea76-8aa7-41fb-9e42-df147f6c755f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.185449', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd358dca0-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'ad8cf752041775708432b4a9f6356e7e30257c5154a6f608ab036541c5366d8c'}]}, 'timestamp': '2026-02-23 
10:02:56.189246', '_unique_id': '508721252e65430fa5fa85508fefbd36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in 
_establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR 
oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.190 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.192 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '14fe9efe-8cec-4dba-9809-a3ea031b8853', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.192110', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd359613e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'ca2ebc0f119fc0d09f925ed09a2a28492c2e26cd9e2959a398dfcd7f2d1c04ba'}]}, 'timestamp': '2026-02-23 10:02:56.192573', '_unique_id': 'fe11d7b5a819423c8b1f17ad8e182e27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.194 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a14eddc-3143-45eb-b4d7-402591b39b06', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.194943', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd359d074-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '31b9eb376ba0cbcbbc81d3c34497f3cd784a1c629bd16f7a8eec4a95428e20b4'}]}, 'timestamp': '2026-02-23 10:02:56.195418', '_unique_id': 'b033fb7ab651476095445ca9676e1a50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.196 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.197 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a10e779f-40f9-4e46-a176-11c5f44176a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.197481', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd35a3294-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '76515aa2fee1d97193b5d2dd6579aeac4a694f26d7dd23e8ac3773be3b235e44'}]}, 'timestamp': '2026-02-23 10:02:56.197957', '_unique_id': '10403af646924f91b45751c7ffba1a77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.199 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.212 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efcbf453-010f-4e90-86a8-56fadc6755ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.200415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd35c7e46-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'd8fc60cea31083e833ab74b995f1a1de7984070c2e1d4f8aed6a54bb06047e58'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.200415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd35c9408-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'da3c13a1c5e61f01dfe43b3c3e60431c8440a41287851a95970dbebed6aa52e1'}]}, 'timestamp': '2026-02-23 10:02:56.213533', '_unique_id': '955ea825f13743c984abc9a1a5c2fb78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.214 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.216 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc7a3ce4-7267-4093-af9c-d64bb5938198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.216858', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd35d2abc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '2ae6fcf75e455b02fe7e3af6a600c905cc79d09cdc8fc3d1b3df6a7e6b3279c9'}]}, 'timestamp': '2026-02-23 10:02:56.217428', '_unique_id': '3d4aacfa14554a0aa0cf411899facf1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.218 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.219 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b973a784-160e-4f9c-ae51-33fb7fc49351', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:02:56.219695', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd35ffb84-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.42464927, 
'message_signature': 'ecb865c306313e265f96014b66b326e0f275b77db54565fc74839bdae8ca34a8'}]}, 'timestamp': '2026-02-23 10:02:56.235927', '_unique_id': '7e4a55183fc84cda826d07926fa75c59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.237 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0ddde6fe-8cf3-42a6-8283-77b08fc9b52b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.238390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3607230-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '71de84f98e2d8abe39ca4046b60bf1e89338c655749661b9b2d0ab2cefaf5c7b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.238390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd360854a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '5ed6ab7fc13a55558124e060455e244dafe877584591848b2bb3decdb8b6fd17'}]}, 'timestamp': '2026-02-23 10:02:56.239369', '_unique_id': '1f4f22fd9794477db04d1d7094407f22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.240 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.241 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.241 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 15480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ac3b462-604a-46f3-ac76-1b5696358c8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15480000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:02:56.241766', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd360f5fc-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.42464927, 'message_signature': '572ec925621cd95ece6559dafea91e5ca47ae6749aa1878ff111e0f218f6fff5'}]}, 'timestamp': '2026-02-23 10:02:56.242241', '_unique_id': '27b95adbb76447ddb9c154021b80cc26'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.243 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.244 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df1b5b4a-27aa-4521-8eff-05721a8aaf27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.244349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3615966-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': '0ac2e4fab49bc1e4d78b7a0c3f153a1b6178511e97bdd5c72e5009dbf57cfd2a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.244349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3616a8c-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': '1e1335aba2dc594545e8b1579adaa677bf90f996cb1da4b45c7071692237ac60'}]}, 'timestamp': '2026-02-23 10:02:56.245206', '_unique_id': '32d9ce3748e34727b984438d30c56aea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.246 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '63cc1be6-9465-4d51-93c7-6525eea786c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.247334', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd361ce46-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'bbeeca9a0b363e2785cd29fbdf531af0220d22cb7977e94b85a9cb73a7630422'}]}, 'timestamp': '2026-02-23 10:02:56.247787', '_unique_id': '79489717e6904685ae1a2b7e697f4f21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.248 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '8dcf3bc7-9eb6-4cb1-96f9-f07407ce3cb0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.250069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd3623a2a-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'ea01cd254ade0b1af95735d91743ae085fcbcb15f5dfc25eb3896b11a8d2ac7f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.250069', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3624ace-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.38991283, 'message_signature': 'cbaf51a6080dab767a54516cb0c568a0156fc835c10b8465f7de1fd236b1cf37'}]}, 'timestamp': '2026-02-23 10:02:56.251011', '_unique_id': '83e8109dc551400f94247b888a56b785'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:02:56.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:02:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.251 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d372c91-d7cd-4425-877e-058a41d346e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.253261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd362b64e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'fa2ebf87d0d2b3803513d0cea1b589406daacc41a4e9e29f28213cbb35348cfd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T10:02:56.253261', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd362c7d8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'c8406a2536e3aa81afd8a03f3c49ecb6a5aa62c7d65cd1aa3e73bebfef152be5'}]}, 'timestamp': '2026-02-23 10:02:56.254254', '_unique_id': 'ab928cbf063f47d8a242f5ea9cbfa810'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.255 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd90809da-2060-4729-80a0-908a50c6d945', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.256083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd36320c0-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'dff518270c1a8239c5a7a33510a7298c64abc8a57e6eec293cca92e745ddcc43'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': 
'37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.256083', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd3632a98-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '1a4b468e0dcce1e3fcfe3e13084d1e00ee58a9519b1c50d63a83f4ee2856bb85'}]}, 'timestamp': '2026-02-23 10:02:56.256602', '_unique_id': 'e3893ca2419b4c4bb814ce9573fb5f7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a93b2715-e474-4583-83ed-12f86decf3b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.258071', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd3636ec2-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'cca4879a5c692d1855cf2d6868d461ccb0312b5c7b1e1ee576f52c1e7e84f334'}]}, 'timestamp': '2026-02-23 10:02:56.258366', '_unique_id': 'b8ad34c3a0aa4cbd9135afd52cea9905'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR 
oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.258 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd523ec54-bac3-45d2-8a06-84f5a7a7071b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.259684', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd363b5c6-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '6d58e21173d4ba46775558db7988d7bef82b67e249f2574dafb99d514e2c00f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.259684', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd363c12e-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'ca07201e9789a2900b8d96575c1332636d96ea2d3c659a576b6bcc536969c389'}]}, 'timestamp': '2026-02-23 10:02:56.260459', '_unique_id': 'ff0ba07f924a45d9bc8c63a92e7ce8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.261 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6df9049-00cb-4b43-9fd2-ecda04f9da4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.261852', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd3640328-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'd164512031e3245861ed1b7746edbe7d92eb60797457e5e3d16cc4c32a03e508'}]}, 'timestamp': '2026-02-23 10:02:56.262166', '_unique_id': 'e30e6e11838c435783abf4b0658dfc86'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.262 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '121b8d96-e3e1-4fc1-9553-55368b3a7103', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.263479', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd36441a8-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': '0668e5d4f46e84fa6f1b304de3c5ffcd29c0b87d5204c6af148cebd62eb50c5b'}]}, 'timestamp': '2026-02-23 10:02:56.263764', '_unique_id': '89d86ba75df04dc18092da5c7edea324'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.265 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df63c6a8-40c8-49f8-9c1d-5270cfc0b67b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:02:56.265130', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'd36483ac-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.374995031, 'message_signature': 'f0e9c6878060d2b1461f306d5810a51e38bf390d02a0c828682ad8becaa8afe1'}]}, 'timestamp': '2026-02-23 10:02:56.265459', '_unique_id': '1f75408ebb1a440888f2f6bbdcfc996f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:02:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.267 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b93d42d-28f5-4295-b2cd-0b9bd7d37a2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:02:56.266787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd364c498-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': '440d827f36cfdd0f80599110373a761b050019d79348cfa4ee0342454664a9de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:02:56.266787', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd364cf74-109e-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12416.337932157, 'message_signature': 'dd4bf841624c479d1d1460d930c2c4cf2e31c2404db385c692704545b93178ec'}]}, 'timestamp': '2026-02-23 10:02:56.267379', '_unique_id': 'cd78a33affac4591be71e707c60a3452'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:02:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:02:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:02:56.268 12 ERROR oslo_messaging.notify.messaging Feb 23 05:02:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:02:56 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/824066865' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:02:56 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:02:56 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/824066865' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:02:56 localhost podman[321354]: Feb 23 05:02:56 localhost podman[321354]: 2026-02-23 10:02:56.399131837 +0000 UTC m=+0.085463084 container create 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:02:56 localhost systemd[1]: Started libpod-conmon-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2.scope. Feb 23 05:02:56 localhost systemd[1]: tmp-crun.EcdjRn.mount: Deactivated successfully. Feb 23 05:02:56 localhost systemd[1]: Started libcrun container. 
Feb 23 05:02:56 localhost podman[321354]: 2026-02-23 10:02:56.358803581 +0000 UTC m=+0.045134858 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:02:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da48425f8f699e12a206073870f9c35f47d4c0c8337a2b7589ceef2e3b973683/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:02:56 localhost podman[321354]: 2026-02-23 10:02:56.471430658 +0000 UTC m=+0.157761955 container init 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 05:02:56 localhost podman[321354]: 2026-02-23 10:02:56.480526384 +0000 UTC m=+0.166857641 container start 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 05:02:56 localhost dnsmasq[321373]: started, version 2.85 cachesize 150 Feb 23 05:02:56 localhost dnsmasq[321373]: DNS service limited to local subnets Feb 23 05:02:56 localhost dnsmasq[321373]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:02:56 localhost dnsmasq[321373]: warning: no upstream servers configured Feb 23 05:02:56 localhost dnsmasq-dhcp[321373]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 05:02:56 localhost dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 0 addresses Feb 23 05:02:56 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host Feb 23 05:02:56 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts Feb 23 05:02:56 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:56.612 265541 INFO neutron.agent.dhcp.agent [None req-32fbde91-026b-45f8-a38b-a43fcbce574f - - - - - -] DHCP configuration for ports {'7a54f1f1-bb03-42b1-8518-feb534fb6ef9'} is completed#033[00m Feb 23 05:02:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e221 do_prune osdmap full prune enabled Feb 23 05:02:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e222 e222: 6 total, 6 up, 6 in Feb 23 05:02:57 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in Feb 23 05:02:57 localhost nova_compute[282206]: 2026-02-23 10:02:57.405 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:02:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:02:58 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:58.280 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:57Z, 
description=, device_id=f2f8a48a-d698-4e75-9a87-03e5d2e9729d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=371d162a-b0cd-4d91-9745-94e1cb8296bd, ip_allocation=immediate, mac_address=fa:16:3e:c4:bf:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:53Z, description=, dns_domain=, id=255d578f-65f8-4643-b21d-1ec8d68e886d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1698092807-network, port_security_enabled=True, project_id=a24474b213514491beaa97b54bfd695f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57786, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3274, status=ACTIVE, subnets=['45a769dd-5712-4113-814d-773e2ed0a09c'], tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:54Z, vlan_transparent=None, network_id=255d578f-65f8-4643-b21d-1ec8d68e886d, port_security_enabled=False, project_id=a24474b213514491beaa97b54bfd695f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3287, status=DOWN, tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:58Z on network 255d578f-65f8-4643-b21d-1ec8d68e886d
Feb 23 05:02:58 localhost systemd[1]: tmp-crun.eExSE7.mount: Deactivated successfully.
Feb 23 05:02:58 localhost dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 1 addresses
Feb 23 05:02:58 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 05:02:58 localhost podman[321391]: 2026-02-23 10:02:58.911898643 +0000 UTC m=+0.461578897 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216)
Feb 23 05:02:58 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 05:02:58 localhost nova_compute[282206]: 2026-02-23 10:02:58.977 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:02:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:59.203 265541 INFO neutron.agent.dhcp.agent [None req-9bc4a8b4-f693-4fe5-b1db-532a094aa1dd - - - - - -] DHCP configuration for ports {'371d162a-b0cd-4d91-9745-94e1cb8296bd'} is completed
Feb 23 05:02:59 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e48: np0005626465.hlpkwo(active, since 11m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 05:02:59 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:02:59.819 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:02:57Z, description=, device_id=f2f8a48a-d698-4e75-9a87-03e5d2e9729d, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=371d162a-b0cd-4d91-9745-94e1cb8296bd, ip_allocation=immediate, mac_address=fa:16:3e:c4:bf:87, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:02:53Z, description=, dns_domain=, id=255d578f-65f8-4643-b21d-1ec8d68e886d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-1698092807-network, port_security_enabled=True, project_id=a24474b213514491beaa97b54bfd695f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57786, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3274, status=ACTIVE, subnets=['45a769dd-5712-4113-814d-773e2ed0a09c'], tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:54Z, vlan_transparent=None, network_id=255d578f-65f8-4643-b21d-1ec8d68e886d, port_security_enabled=False, project_id=a24474b213514491beaa97b54bfd695f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3287, status=DOWN, tags=[], tenant_id=a24474b213514491beaa97b54bfd695f, updated_at=2026-02-23T10:02:58Z on network 255d578f-65f8-4643-b21d-1ec8d68e886d
Feb 23 05:03:00 localhost dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 1 addresses
Feb 23 05:03:00 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 05:03:00 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 05:03:00 localhost podman[321431]: 2026-02-23 10:03:00.037845373 +0000 UTC m=+0.067767149 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0)
Feb 23 05:03:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:03:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:03:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:03:00 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:03:00.254 265541 INFO neutron.agent.dhcp.agent [None req-50ec6260-869f-47ce-bb87-d1f6423194c0 - - - - - -] DHCP configuration for ports {'371d162a-b0cd-4d91-9745-94e1cb8296bd'} is completed
Feb 23 05:03:00 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 05:03:00 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:03:00 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:03:00 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:03:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:03:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:03:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e222 do_prune osdmap full prune enabled
Feb 23 05:03:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e223 e223: 6 total, 6 up, 6 in
Feb 23 05:03:01 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in
Feb 23 05:03:01 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:01.614 2 INFO neutron.agent.securitygroups_rpc [None req-b3584fe0-66dc-4b0f-aa46-a4b6d8c2c0ea a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['26548beb-0e57-409b-96fc-150c1ca0653f']
Feb 23 05:03:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 23 05:03:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 05:03:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 05:03:01 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 05:03:01 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:01.817 2 INFO neutron.agent.securitygroups_rpc [None req-f7ece8f8-128a-45ae-8798-1ff174ac4586 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['26548beb-0e57-409b-96fc-150c1ca0653f']
Feb 23 05:03:02 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 23 05:03:02 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 05:03:02 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 23 05:03:02 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 23 05:03:02 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:02.440 2 INFO neutron.agent.securitygroups_rpc [None req-0700196d-2b57-4e61-9046-6185a324d5af a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e223 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:03:02 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:02.574 2 INFO neutron.agent.securitygroups_rpc [None req-26370991-3c47-4f17-8c42-2e91914f0a91 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:02 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:02.718 2 INFO neutron.agent.securitygroups_rpc [None req-0212ca46-343b-4ade-95ab-95bdce90a790 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.
Feb 23 05:03:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.
Feb 23 05:03:02 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:02.845 2 INFO neutron.agent.securitygroups_rpc [None req-bb2604e0-3280-4b44-93fb-d763a31d90c5 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:02 localhost podman[321453]: 2026-02-23 10:03:02.921559599 +0000 UTC m=+0.084287298 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Feb 23 05:03:02 localhost podman[321452]: 2026-02-23 10:03:02.964731274 +0000 UTC m=+0.130052044 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.expose-services=)
Feb 23 05:03:02 localhost podman[321452]: 2026-02-23 10:03:02.982208783 +0000 UTC m=+0.147529583 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=)
Feb 23 05:03:02 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:02.997 2 INFO neutron.agent.securitygroups_rpc [None req-c9497ba8-e7cb-4ffb-8fd8-5a9e080024dc a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:02 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 05:03:03 localhost podman[321453]: 2026-02-23 10:03:03.035732174 +0000 UTC m=+0.198459913 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 05:03:03 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 05:03:03 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:03.120 2 INFO neutron.agent.securitygroups_rpc [None req-d0ec72bc-8566-4325-bbab-9dfaee70a365 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:03 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:03.443 2 INFO neutron.agent.securitygroups_rpc [None req-ddb1651b-3ac2-4a8f-8977-00c142537a52 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 05:03:03 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2123620571' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 05:03:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 05:03:03 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2123620571' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 05:03:03 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:03.600 2 INFO neutron.agent.securitygroups_rpc [None req-907a0d0b-deea-47b0-a1db-b229494a62fe a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:03 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:03.781 2 INFO neutron.agent.securitygroups_rpc [None req-bbab3356-bbf6-432a-a4a9-3be3927ef59f a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:03 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:03.960 2 INFO neutron.agent.securitygroups_rpc [None req-a3d9205c-35b8-4d92-aa27-e429ab8dbfe2 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['90bc54da-a8e2-4a34-bb37-4fcf0d4c07c4']
Feb 23 05:03:03 localhost nova_compute[282206]: 2026-02-23 10:03:03.979 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:03:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e223 do_prune osdmap full prune enabled
Feb 23 05:03:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e224 e224: 6 total, 6 up, 6 in
Feb 23 05:03:04 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in
Feb 23 05:03:04 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:04.636 2 INFO neutron.agent.securitygroups_rpc [None req-daa2f3e5-708b-419e-b743-0c985d38df1b a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['087f2ead-29df-46bd-b356-e193c3a6c3a1']
Feb 23 05:03:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 23 05:03:05 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3379545140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 23 05:03:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 23 05:03:05 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3379545140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 23 05:03:05 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:05.781 2 INFO neutron.agent.securitygroups_rpc [None req-931492cc-c107-4f00-af23-020a7b1bfb4a a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['33c4ddfa-59ae-40a7-8c2f-cf1ffe09eb9f']
Feb 23 05:03:05 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:05.920 2 INFO neutron.agent.securitygroups_rpc [None req-8d452d14-dd60-417e-87bd-a7a2372759de a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['33c4ddfa-59ae-40a7-8c2f-cf1ffe09eb9f']
Feb 23 05:03:06 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:06.751 2 INFO neutron.agent.securitygroups_rpc [None req-43388687-9a00-4563-af08-790772899ea4 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['e9d6e743-53d6-4e9c-950f-ebadc1a82c0f']
Feb 23 05:03:06 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:06.871 2 INFO neutron.agent.securitygroups_rpc [None req-8bc11e13-0374-4d0f-a420-dc9bed7775c4 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['e9d6e743-53d6-4e9c-950f-ebadc1a82c0f']
Feb 23 05:03:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:03:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e224 do_prune osdmap full prune enabled
Feb 23 05:03:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e225 e225: 6 total, 6 up, 6 in
Feb 23 05:03:07 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in
Feb 23 05:03:07 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:07.697 2 INFO neutron.agent.securitygroups_rpc [None req-7032d099-2d57-4911-980c-bd2a477e3e37 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 05:03:07 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:07.905 2 INFO neutron.agent.securitygroups_rpc [None req-09a1ef76-51b7-437b-b37e-bb449ab05579 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 05:03:08 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:08.079 2 INFO neutron.agent.securitygroups_rpc [None req-6957b0df-fd0d-456d-b289-5c8ae6b2d3ab a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 05:03:08 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:08.358 2 INFO neutron.agent.securitygroups_rpc [None req-2e64ea36-7ced-48d5-9b78-d6c2b3b8afa9 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 05:03:08 localhost ovn_controller[157695]: 2026-02-23T10:03:08Z|00359|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0)
Feb 23 05:03:08 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:08.645 2 INFO neutron.agent.securitygroups_rpc [None req-ac68416c-3f8b-4a61-93f4-c722d3303f27 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 05:03:08 localhost nova_compute[282206]: 2026-02-23 10:03:08.655 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:03:08 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:08.908 2 INFO neutron.agent.securitygroups_rpc [None req-b0017ab9-a450-4286-8fae-6ccd40fd4966 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['b2b511b6-235a-4475-b039-adf8e8bf337f']
Feb 23 05:03:08 localhost nova_compute[282206]: 2026-02-23 10:03:08.982 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:03:09 localhost podman[242954]: time="2026-02-23T10:03:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 05:03:09 localhost podman[242954]: @ - - [23/Feb/2026:10:03:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 23 05:03:09 localhost podman[242954]: @ - - [23/Feb/2026:10:03:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19298 "" "Go-http-client/1.1"
Feb 23 05:03:09 localhost neutron_sriov_agent[258207]: 2026-02-23 10:03:09.745 2 INFO neutron.agent.securitygroups_rpc [None req-b5ba1047-11d6-4902-ae19-bf3c18cfb931 a9be4932f1a84a8293065e9227797a47 d45d0b9da54741348d1d12c73041586e - - default default] Security group rule updated ['9ad178d0-3a41-40dd-be58-0e7ebb53d59d']
Feb 23 05:03:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:03:10 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:03:10 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:03:10 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 05:03:10 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:03:10 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:03:10 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:03:10 localhost podman[321510]: 2026-02-23 10:03:10.44413568 +0000 UTC m=+0.060759320 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 23 05:03:10 localhost dnsmasq[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/addn_hosts - 0 addresses
Feb 23 05:03:10 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/host
Feb 23 05:03:10 localhost dnsmasq-dhcp[321373]: read /var/lib/neutron/dhcp/255d578f-65f8-4643-b21d-1ec8d68e886d/opts
Feb 23 05:03:10 localhost nova_compute[282206]: 2026-02-23 10:03:10.696 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:03:10 localhost kernel: device tap76d8ca35-ba left promiscuous mode
Feb 23 05:03:10 localhost ovn_controller[157695]: 2026-02-23T10:03:10Z|00360|binding|INFO|Releasing lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d from this chassis (sb_readonly=0)
Feb 23 05:03:10 localhost ovn_controller[157695]: 2026-02-23T10:03:10Z|00361|binding|INFO|Setting lport 76d8ca35-bae5-4d5e-827a-3f91e4067a0d down in Southbound
Feb 23 05:03:10 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:10.706 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-255d578f-65f8-4643-b21d-1ec8d68e886d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a24474b213514491beaa97b54bfd695f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=40e0ff5c-1777-4eef-b7ef-c7cac136f27e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=76d8ca35-bae5-4d5e-827a-3f91e4067a0d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 23 05:03:10 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:10.708 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 76d8ca35-bae5-4d5e-827a-3f91e4067a0d in datapath 255d578f-65f8-4643-b21d-1ec8d68e886d unbound from our chassis
Feb 23 05:03:10 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:10.710 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 255d578f-65f8-4643-b21d-1ec8d68e886d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 23 05:03:10 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:10.711 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[59af9dfb-28b9-4bda-80b5-f67fee184575]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 23 05:03:10 localhost nova_compute[282206]: 2026-02-23 10:03:10.723 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 23 05:03:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 05:03:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:03:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 05:03:11 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict
{filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:03:12 localhost ovn_controller[157695]: 2026-02-23T10:03:12Z|00362|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:03:12 localhost nova_compute[282206]: 2026-02-23 10:03:12.268 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:12 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:12 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:03:12 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:03:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e225 do_prune osdmap full prune enabled Feb 23 05:03:12 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader).osd e226 e226: 6 total, 6 up, 6 in Feb 23 05:03:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in Feb 23 05:03:12 localhost podman[321551]: 2026-02-23 10:03:12.593405359 +0000 UTC m=+0.066357004 container kill 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:03:12 localhost dnsmasq[321373]: exiting on receipt of SIGTERM Feb 23 05:03:12 localhost systemd[1]: libpod-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2.scope: Deactivated successfully. 
Feb 23 05:03:12 localhost podman[321564]: 2026-02-23 10:03:12.67971706 +0000 UTC m=+0.066352085 container died 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 05:03:12 localhost podman[321564]: 2026-02-23 10:03:12.714853004 +0000 UTC m=+0.101487989 container cleanup 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Feb 23 05:03:12 localhost systemd[1]: libpod-conmon-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2.scope: Deactivated successfully. 
Feb 23 05:03:12 localhost podman[321565]: 2026-02-23 10:03:12.757207774 +0000 UTC m=+0.143194748 container remove 36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-255d578f-65f8-4643-b21d-1ec8d68e886d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:03:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:03:12.791 265541 INFO neutron.agent.dhcp.agent [None req-20033a4e-e604-41c3-bc6e-9c974713406a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:03:12 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:03:12.795 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:03:13 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Feb 23 05:03:13 localhost openstack_network_exporter[245358]: ERROR 10:03:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:03:13 localhost openstack_network_exporter[245358]: Feb 23 05:03:13 localhost openstack_network_exporter[245358]: ERROR 10:03:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:03:13 localhost openstack_network_exporter[245358]: Feb 23 05:03:13 localhost systemd[1]: var-lib-containers-storage-overlay-da48425f8f699e12a206073870f9c35f47d4c0c8337a2b7589ceef2e3b973683-merged.mount: Deactivated successfully. 
Feb 23 05:03:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36b444cefab4e0d0a3395ea11015581ac534c601549b4a8ad5ca75745502a0d2-userdata-shm.mount: Deactivated successfully. Feb 23 05:03:13 localhost systemd[1]: run-netns-qdhcp\x2d255d578f\x2d65f8\x2d4643\x2db21d\x2d1ec8d68e886d.mount: Deactivated successfully. Feb 23 05:03:13 localhost nova_compute[282206]: 2026-02-23 10:03:13.985 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:03:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command 
mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:15 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:15 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:03:15 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e226 do_prune osdmap full prune enabled Feb 23 05:03:15 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e227 e227: 6 total, 6 up, 6 in Feb 
23 05:03:15 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e227: 6 total, 6 up, 6 in Feb 23 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:03:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:03:16 localhost systemd[1]: tmp-crun.AKQFTc.mount: Deactivated successfully. Feb 23 05:03:16 localhost podman[321592]: 2026-02-23 10:03:16.947218283 +0000 UTC m=+0.119123102 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Feb 23 05:03:17 localhost podman[321593]: 2026-02-23 10:03:17.015175938 +0000 UTC m=+0.183767863 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:03:17 localhost podman[321593]: 2026-02-23 10:03:17.028623689 +0000 UTC m=+0.197215664 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:03:17 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:03:17 localhost podman[321592]: 2026-02-23 10:03:17.054690609 +0000 UTC m=+0.226595438 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260216, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 05:03:17 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:03:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:03:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3710238268' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:03:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:03:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3710238268' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:03:17 localhost systemd[1]: tmp-crun.P35pO1.mount: Deactivated successfully. 
Feb 23 05:03:18 localhost nova_compute[282206]: 2026-02-23 10:03:18.987 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:03:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e227 do_prune osdmap full prune enabled Feb 23 05:03:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e228 e228: 6 total, 6 up, 6 in Feb 23 05:03:21 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in Feb 23 05:03:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e228 do_prune osdmap full prune enabled Feb 23 05:03:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e229 e229: 6 total, 6 up, 6 in Feb 23 05:03:22 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in Feb 23 05:03:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:03:22 localhost podman[321641]: 2026-02-23 10:03:22.908382241 +0000 UTC m=+0.085545618 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0) Feb 23 05:03:22 localhost podman[321641]: 2026-02-23 10:03:22.921991728 +0000 UTC m=+0.099155135 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, 
io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 05:03:22 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:03:23 localhost podman[321661]: 2026-02-23 10:03:23.915293914 +0000 UTC m=+0.083693940 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent) Feb 23 05:03:23 localhost podman[321661]: 2026-02-23 10:03:23.949175398 +0000 UTC m=+0.117575424 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:03:23 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:03:23 localhost nova_compute[282206]: 2026-02-23 10:03:23.991 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:03:23 localhost nova_compute[282206]: 2026-02-23 10:03:23.993 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:23 localhost nova_compute[282206]: 2026-02-23 10:03:23.994 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:03:23 localhost nova_compute[282206]: 2026-02-23 10:03:23.994 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:03:23 localhost nova_compute[282206]: 2026-02-23 10:03:23.994 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:03:23 localhost nova_compute[282206]: 2026-02-23 10:03:23.997 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' 
entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:03:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/164380082' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:03:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:03:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/164380082' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:03:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:25 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:25 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 23 05:03:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:25 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:03:25 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.115 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.115 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.116 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.116 
282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:03:26 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:03:26 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:26 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:03:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e229 do_prune osdmap full prune enabled Feb 23 05:03:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e230 e230: 6 total, 6 up, 6 in Feb 23 05:03:26 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.807 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": 
"192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.824 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.824 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.825 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:26 localhost nova_compute[282206]: 2026-02-23 10:03:26.825 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:03:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:03:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:28 
localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:03:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2068515464' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:03:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:03:28 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2068515464' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:03:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:03:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:28 localhost nova_compute[282206]: 2026-02-23 10:03:28.995 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain.devices.0}] v 0) Feb 23 05:03:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626465.localdomain}] v 0) Feb 23 05:03:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain.devices.0}] v 0) Feb 23 05:03:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005626463.localdomain.devices.0}] v 0) Feb 23 05:03:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626466.localdomain}] v 0) Feb 23 05:03:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005626463.localdomain}] v 0) Feb 23 05:03:29 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:03:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e230 do_prune osdmap full prune enabled Feb 23 05:03:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 e231: 6 total, 6 up, 6 in Feb 23 05:03:30 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in Feb 
23 05:03:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:03:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:31 localhost nova_compute[282206]: 2026-02-23 10:03:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:03:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:03:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:03:31 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:03:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:03:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:31 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.284 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.284 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.285 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.285 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.285 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:03:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:03:32 
localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4145762043' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:03:32 localhost nova_compute[282206]: 2026-02-23 10:03:32.739 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:03:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:03:33 localhost podman[321845]: 2026-02-23 10:03:33.947380902 +0000 UTC m=+0.092641790 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, release=1770267347, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 05:03:33 localhost podman[321845]: 2026-02-23 10:03:33.992453427 +0000 UTC m=+0.137714235 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter) Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:33.999 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:34 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:03:34 localhost podman[321846]: 2026-02-23 10:03:33.99605035 +0000 UTC m=+0.139055907 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:03:34 localhost podman[321846]: 2026-02-23 10:03:34.078274563 +0000 UTC m=+0.221280080 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:03:34 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.230 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.230 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.439 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.440 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11255MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.440 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.440 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.562 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.563 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.563 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:03:34 localhost nova_compute[282206]: 2026-02-23 10:03:34.630 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:03:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:03:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r 
pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:03:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4272332485' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:03:35 localhost nova_compute[282206]: 2026-02-23 10:03:35.116 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:03:35 localhost nova_compute[282206]: 2026-02-23 10:03:35.123 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:03:35 localhost nova_compute[282206]: 2026-02-23 10:03:35.145 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 
'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:03:35 localhost nova_compute[282206]: 2026-02-23 10:03:35.147 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:03:35 localhost nova_compute[282206]: 2026-02-23 10:03:35.148 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:03:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:03:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r 
pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:36 localhost nova_compute[282206]: 2026-02-23 10:03:36.144 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:36 localhost nova_compute[282206]: 2026-02-23 10:03:36.145 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:36 localhost nova_compute[282206]: 2026-02-23 10:03:36.173 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:36 localhost 
nova_compute[282206]: 2026-02-23 10:03:36.173 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:36 localhost nova_compute[282206]: 2026-02-23 10:03:36.174 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e231 do_prune osdmap full prune enabled Feb 23 05:03:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e232 e232: 6 total, 6 up, 6 in Feb 23 05:03:37 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in Feb 23 05:03:38 localhost nova_compute[282206]: 2026-02-23 10:03:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:03:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:03:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:03:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": 
"client.alice bob"}]': finished Feb 23 05:03:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:03:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:03:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:03:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:03:38 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:03:39 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:39.000 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:03:39 localhost nova_compute[282206]: 2026-02-23 10:03:39.000 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:39 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:39.003 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:03:39 localhost nova_compute[282206]: 2026-02-23 10:03:39.003 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e232 do_prune osdmap full prune enabled Feb 23 05:03:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e233 e233: 6 total, 6 up, 6 in Feb 23 05:03:39 localhost podman[242954]: time="2026-02-23T10:03:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:03:39 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in Feb 23 05:03:39 localhost podman[242954]: @ - - [23/Feb/2026:10:03:39 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:03:39 localhost podman[242954]: @ - - [23/Feb/2026:10:03:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18833 "" "Go-http-client/1.1" Feb 23 05:03:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e233 do_prune osdmap full prune enabled Feb 23 05:03:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e234 e234: 6 total, 6 up, 6 in Feb 23 05:03:40 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in Feb 23 05:03:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:41 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:03:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' 
entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:41 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. Feb 23 05:03:42 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:42.008 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:03:42 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:03:42 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:42 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", 
"allow r"], "format": "json"} : dispatch Feb 23 05:03:42 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e234 do_prune osdmap full prune enabled Feb 23 05:03:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e235 e235: 6 total, 6 up, 6 in Feb 23 05:03:42 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in Feb 23 05:03:43 localhost ovn_controller[157695]: 2026-02-23T10:03:43Z|00363|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 23 05:03:43 localhost openstack_network_exporter[245358]: ERROR 10:03:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:03:43 localhost openstack_network_exporter[245358]: Feb 23 05:03:43 localhost openstack_network_exporter[245358]: ERROR 10:03:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:03:43 localhost openstack_network_exporter[245358]: Feb 23 05:03:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:43 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:43 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader).osd e235 do_prune osdmap full prune enabled Feb 23 05:03:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e236 e236: 6 total, 6 up, 6 in Feb 23 05:03:43 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in Feb 23 05:03:44 localhost nova_compute[282206]: 2026-02-23 10:03:44.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:44 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:03:45 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:03:45 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:03:45 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:03:45 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:03:45 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:03:45 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:03:45 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:03:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e236 do_prune osdmap full prune enabled Feb 23 05:03:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e237 e237: 6 total, 6 up, 6 in Feb 23 05:03:47 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in Feb 23 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:03:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 05:03:47 localhost podman[321912]: 2026-02-23 10:03:47.924451707 +0000 UTC m=+0.093763409 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller) Feb 23 05:03:47 localhost podman[321913]: 2026-02-23 10:03:47.975263918 +0000 UTC m=+0.141614349 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:03:47 localhost podman[321912]: 2026-02-23 10:03:47.990361575 +0000 UTC m=+0.159673277 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller) Feb 23 05:03:48 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 05:03:48 localhost podman[321913]: 2026-02-23 10:03:48.012437268 +0000 UTC m=+0.178787719 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 
'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:03:48 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 05:03:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:03:48 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:48 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:48.563 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring 
lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:03:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:48.563 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:03:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:03:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:03:49 localhost nova_compute[282206]: 2026-02-23 10:03:49.008 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:49 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:03:49 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:49 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:49 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e237 do_prune osdmap full prune enabled Feb 23 05:03:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e238 e238: 6 total, 6 up, 6 in Feb 23 05:03:50 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in Feb 23 05:03:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:51 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' 
cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e238 do_prune osdmap full prune enabled Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e239 e239: 6 total, 6 up, 6 in Feb 23 05:03:52 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:03:52 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e239 do_prune osdmap full prune enabled Feb 23 05:03:52 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 e240: 6 total, 6 up, 6 in Feb 23 05:03:52 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in Feb 23 05:03:52 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:03:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:52 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:53 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:03:53 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:03:53 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:03:53 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:03:53 localhost podman[321956]: 2026-02-23 10:03:53.92010622 +0000 UTC m=+0.091792698 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
container_name=ceilometer_agent_compute) Feb 23 05:03:53 localhost podman[321956]: 2026-02-23 10:03:53.934307499 +0000 UTC m=+0.105994007 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0) Feb 23 05:03:53 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:03:54 localhost nova_compute[282206]: 2026-02-23 10:03:54.012 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:03:54 localhost podman[321975]: 2026-02-23 10:03:54.922076827 +0000 UTC m=+0.088899060 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2) Feb 23 05:03:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:03:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:54 localhost podman[321975]: 2026-02-23 10:03:54.96131138 +0000 UTC m=+0.128133553 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 05:03:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:54 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. 
Feb 23 05:03:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:03:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:03:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:03:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:03:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 23 05:03:58 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:58 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:03:58 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:03:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:03:58 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:03:59 localhost nova_compute[282206]: 2026-02-23 10:03:59.016 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:03:59 localhost nova_compute[282206]: 2026-02-23 10:03:59.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:03:59 localhost nova_compute[282206]: 2026-02-23 10:03:59.019 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:03:59 localhost nova_compute[282206]: 2026-02-23 10:03:59.019 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:03:59 localhost nova_compute[282206]: 2026-02-23 10:03:59.041 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 
05:03:59 localhost nova_compute[282206]: 2026-02-23 10:03:59.042 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:03:59 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:03:59 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:59 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:03:59 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:04:01 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:01 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", 
"entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:02 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:04:02 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:02 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:02 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 
Feb 23 05:04:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e240 do_prune osdmap full prune enabled Feb 23 05:04:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e241 e241: 6 total, 6 up, 6 in Feb 23 05:04:02 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in Feb 23 05:04:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e241 do_prune osdmap full prune enabled Feb 23 05:04:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 e242: 6 total, 6 up, 6 in Feb 23 05:04:03 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in Feb 23 05:04:03 localhost sshd[321994]: main: sshd: ssh-rsa algorithm is disabled Feb 23 05:04:04 localhost nova_compute[282206]: 2026-02-23 10:04:04.043 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:04 localhost nova_compute[282206]: 2026-02-23 10:04:04.046 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:04 localhost nova_compute[282206]: 2026-02-23 10:04:04.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:04:04 localhost nova_compute[282206]: 2026-02-23 10:04:04.047 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:04 localhost nova_compute[282206]: 2026-02-23 10:04:04.086 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:04 localhost nova_compute[282206]: 2026-02-23 10:04:04.087 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE 
_transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0. Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.151637) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044151767, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2661, "num_deletes": 265, "total_data_size": 2833728, "memory_usage": 2921888, "flush_reason": "Manual Compaction"} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044172227, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 2771005, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33119, "largest_seqno": 35779, "table_properties": {"data_size": 2759023, "index_size": 7591, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3333, "raw_key_size": 30038, "raw_average_key_size": 22, "raw_value_size": 2733367, "raw_average_value_size": 2077, "num_data_blocks": 318, "num_entries": 1316, "num_filter_entries": 1316, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", 
"column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771840925, "oldest_key_time": 1771840925, "file_creation_time": 1771841044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 20638 microseconds, and 8952 cpu microseconds. Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.172291) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 2771005 bytes OK Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.172323) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.174304) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.174325) EVENT_LOG_v1 {"time_micros": 1771841044174319, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.174360) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 2821520, prev total WAL file size 2821520, number of live WAL files 2. Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.175291) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. 
'7061786F73003132383032' seq:0, type:0; will stop at (end) Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2706KB)], [60(16MB)] Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044175346, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 20160651, "oldest_snapshot_seqno": -1} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 14031 keys, 18879027 bytes, temperature: kUnknown Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044291744, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 18879027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18796651, "index_size": 46179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35141, "raw_key_size": 374462, "raw_average_key_size": 26, "raw_value_size": 18555953, "raw_average_value_size": 1322, "num_data_blocks": 1751, "num_entries": 14031, "num_filter_entries": 14031, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841044, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.292195) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 18879027 bytes Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.294064) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.0 rd, 162.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 16.6 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(14.1) write-amplify(6.8) OK, records in: 14584, records dropped: 553 output_compression: NoCompression Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.294094) EVENT_LOG_v1 {"time_micros": 1771841044294080, "job": 36, "event": "compaction_finished", "compaction_time_micros": 116545, "compaction_time_cpu_micros": 54004, "output_level": 6, "num_output_files": 1, "total_output_size": 18879027, "num_input_records": 14584, "num_output_records": 14031, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044294687, "job": 36, "event": "table_file_deletion", "file_number": 62} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841044297720, "job": 36, "event": "table_file_deletion", "file_number": 60} Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.175194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:04 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:04.298274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. 
Feb 23 05:04:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:04:04 localhost systemd[1]: tmp-crun.pMVI1n.mount: Deactivated successfully. Feb 23 05:04:04 localhost podman[321996]: 2026-02-23 10:04:04.939361666 +0000 UTC m=+0.104976987 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:04:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 23 05:04:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:04:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:04:04 localhost podman[321995]: 2026-02-23 10:04:04.996075729 +0000 UTC m=+0.165702034 container health_status 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers) Feb 23 05:04:05 localhost podman[321996]: 2026-02-23 10:04:05.002510197 +0000 UTC m=+0.168125498 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:04:05 localhost systemd[1]: 
da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:04:05 localhost podman[321995]: 2026-02-23 10:04:05.040383409 +0000 UTC m=+0.210009694 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.7, io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, release=1770267347, managed_by=edpm_ansible, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 23 05:04:05 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:04:05 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:04:05 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:04:05 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:04:05 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:04:05 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:04:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:05 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:06 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e49: np0005626465.hlpkwo(active, since 13m), standbys: np0005626463.wtksup, np0005626466.nisqfq Feb 23 05:04:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' 
cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:08 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:09 localhost nova_compute[282206]: 2026-02-23 10:04:09.088 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:09 localhost nova_compute[282206]: 2026-02-23 10:04:09.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:09 localhost nova_compute[282206]: 2026-02-23 10:04:09.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:04:09 localhost nova_compute[282206]: 2026-02-23 10:04:09.089 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:09 localhost nova_compute[282206]: 2026-02-23 10:04:09.121 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:09 localhost nova_compute[282206]: 2026-02-23 10:04:09.122 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:09 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:09 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:09 localhost podman[242954]: time="2026-02-23T10:04:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:04:09 localhost podman[242954]: @ - - [23/Feb/2026:10:04:09 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:04:09 localhost podman[242954]: @ - - [23/Feb/2026:10:04:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18836 "" "Go-http-client/1.1" Feb 23 05:04:11 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:04:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:11 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:11 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:04:12 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:12 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:12 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e242 do_prune osdmap full prune enabled Feb 23 05:04:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 e243: 6 total, 6 up, 6 in Feb 23 05:04:12 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in Feb 23 05:04:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:13 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:13 localhost openstack_network_exporter[245358]: ERROR 10:04:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:04:13 localhost openstack_network_exporter[245358]: Feb 23 05:04:13 localhost openstack_network_exporter[245358]: ERROR 10:04:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:04:13 localhost 
openstack_network_exporter[245358]: Feb 23 05:04:14 localhost nova_compute[282206]: 2026-02-23 10:04:14.123 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:14 localhost nova_compute[282206]: 2026-02-23 10:04:14.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:14 localhost nova_compute[282206]: 2026-02-23 10:04:14.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:04:14 localhost nova_compute[282206]: 2026-02-23 10:04:14.125 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:14 localhost nova_compute[282206]: 2026-02-23 10:04:14.167 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:14 localhost nova_compute[282206]: 2026-02-23 10:04:14.168 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": 
["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:15 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:15 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:15 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:15 localhost nova_compute[282206]: 2026-02-23 10:04:15.966 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:16 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:04:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:18 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:18 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:04:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:18 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:18 localhost nova_compute[282206]: 2026-02-23 10:04:18.808 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:04:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 05:04:18 localhost podman[322039]: 2026-02-23 10:04:18.92714154 +0000 UTC m=+0.086517505 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Feb 23 05:04:18 localhost podman[322040]: 2026-02-23 10:04:18.983190613 +0000 UTC m=+0.137896654 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:04:18 localhost podman[322040]: 2026-02-23 10:04:18.992402988 +0000 UTC m=+0.147109019 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', 
'--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:04:19 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 05:04:19 localhost podman[322039]: 2026-02-23 10:04:19.046738538 +0000 UTC m=+0.206114493 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible) Feb 23 05:04:19 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 05:04:19 localhost nova_compute[282206]: 2026-02-23 10:04:19.168 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:19 localhost nova_compute[282206]: 2026-02-23 10:04:19.171 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:19 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:19 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:19 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:21 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:21 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:21 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:21 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:04:21 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:23 localhost ovn_controller[157695]: 2026-02-23T10:04:23Z|00364|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:04:23 localhost nova_compute[282206]: 2026-02-23 10:04:23.623 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:24 localhost nova_compute[282206]: 2026-02-23 10:04:24.170 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:24 localhost nova_compute[282206]: 2026-02-23 10:04:24.173 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:04:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:04:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' 
entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:04:24 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:04:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:04:24 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:04:24 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:04:24 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:04:24 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:04:24 localhost podman[322085]: 2026-02-23 10:04:24.919975993 +0000 UTC m=+0.092340455 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 05:04:24 localhost podman[322085]: 2026-02-23 10:04:24.962350303 +0000 UTC m=+0.134714755 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute) Feb 23 05:04:24 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:04:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. 
Feb 23 05:04:25 localhost podman[322104]: 2026-02-23 10:04:25.104829588 +0000 UTC m=+0.097356740 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Feb 23 05:04:25 localhost 
podman[322104]: 2026-02-23 10:04:25.141378398 +0000 UTC m=+0.133905510 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260216, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:04:25 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:04:26 localhost nova_compute[282206]: 2026-02-23 10:04:26.056 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:26 localhost nova_compute[282206]: 2026-02-23 10:04:26.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:04:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:26 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.057 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.057 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.058 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.157 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.159 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.159 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.159 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:04:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": 
"json"} v 0) Feb 23 05:04:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.886 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:04:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] 
: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.901 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:04:27 localhost nova_compute[282206]: 2026-02-23 10:04:27.902 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:04:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:04:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", 
"allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:28 localhost nova_compute[282206]: 2026-02-23 10:04:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:28 localhost nova_compute[282206]: 2026-02-23 10:04:28.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Feb 23 05:04:29 localhost nova_compute[282206]: 2026-02-23 10:04:29.174 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:29 localhost nova_compute[282206]: 2026-02-23 10:04:29.176 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:29 localhost nova_compute[282206]: 2026-02-23 10:04:29.177 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:04:29 localhost nova_compute[282206]: 2026-02-23 10:04:29.178 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:29 localhost nova_compute[282206]: 2026-02-23 10:04:29.213 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:29 localhost nova_compute[282206]: 2026-02-23 10:04:29.214 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:29 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:29 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:31 localhost nova_compute[282206]: 2026-02-23 10:04:31.068 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:04:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:04:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:04:31 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict 
{filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:04:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:04:31 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:04:32 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:04:32 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:04:32 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:04:32 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:04:32 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:04:32 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.075 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.076 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.076 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.076 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:04:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:04:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3464871891' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.527 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:04:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0. Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.584064) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072584168, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 788, "num_deletes": 261, "total_data_size": 743234, "memory_usage": 758088, "flush_reason": "Manual Compaction"} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072596390, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 733430, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35780, "largest_seqno": 36567, 
"table_properties": {"data_size": 729463, "index_size": 1566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10567, "raw_average_key_size": 20, "raw_value_size": 720714, "raw_average_value_size": 1388, "num_data_blocks": 68, "num_entries": 519, "num_filter_entries": 519, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841044, "oldest_key_time": 1771841044, "file_creation_time": 1771841072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 12373 microseconds, and 5312 cpu microseconds. Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.596 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.597 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.596449) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 733430 bytes OK Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.596480) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.598722) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.598744) EVENT_LOG_v1 {"time_micros": 1771841072598737, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.598776) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 738909, prev total WAL file size 739233, number of live WAL files 2. 
Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.599443) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323730' seq:72057594037927935, type:22 .. '6C6F676D0034353235' seq:0, type:0; will stop at (end) Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(716KB)], [63(18MB)] Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072599495, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 19612457, "oldest_snapshot_seqno": -1} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 14002 keys, 19209445 bytes, temperature: kUnknown Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072721903, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 19209445, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19127151, "index_size": 46225, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35013, "raw_key_size": 375300, "raw_average_key_size": 26, "raw_value_size": 18886681, "raw_average_value_size": 1348, 
"num_data_blocks": 1745, "num_entries": 14002, "num_filter_entries": 14002, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841072, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.722185) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 19209445 bytes Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.724649) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.1 rd, 156.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 18.0 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(52.9) write-amplify(26.2) OK, records in: 14550, records dropped: 548 output_compression: NoCompression Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.724669) EVENT_LOG_v1 {"time_micros": 1771841072724660, "job": 38, "event": "compaction_finished", "compaction_time_micros": 122471, "compaction_time_cpu_micros": 52435, "output_level": 6, "num_output_files": 1, "total_output_size": 19209445, "num_input_records": 14550, "num_output_records": 14002, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072724900, "job": 38, "event": "table_file_deletion", "file_number": 65} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841072726636, 
"job": 38, "event": "table_file_deletion", "file_number": 63} Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.599352) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726716) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726731) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:04:32.726735) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.848 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.850 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11245MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.851 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:04:32 localhost nova_compute[282206]: 2026-02-23 10:04:32.851 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:04:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:32 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.035 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.036 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.036 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.097 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing inventories for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.171 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating ProviderTree inventory for provider be63d86c-a403-4ec9-a515-07ea2962cb4d from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Feb 23 
05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.172 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Updating inventory in ProviderTree for provider be63d86c-a403-4ec9-a515-07ea2962cb4d with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.187 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing aggregate associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.220 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Refreshing trait associations for resource provider be63d86c-a403-4ec9-a515-07ea2962cb4d, traits: 
HW_CPU_X86_AVX2,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_MMX,HW_CPU_X86_SSE41,HW_CPU_X86_SSE2,HW_CPU_X86_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_RESCUE_BFV,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_CLMUL,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_ABM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NODE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE4A,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.258 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:04:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": 
"json"} v 0) Feb 23 05:04:33 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1770454627' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.748 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.756 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.777 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.779 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:04:33 localhost nova_compute[282206]: 2026-02-23 10:04:33.780 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.215 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.218 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.219 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.242 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.776 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task 
ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:34 localhost nova_compute[282206]: 2026-02-23 10:04:34.777 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:04:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": 
"json"} : dispatch Feb 23 05:04:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:04:35 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:04:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:04:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 05:04:35 localhost podman[322252]: 2026-02-23 10:04:35.92514173 +0000 UTC m=+0.091923663 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 05:04:35 localhost podman[322252]: 2026-02-23 10:04:35.939333649 +0000 UTC m=+0.106115562 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 23 05:04:35 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully.
Feb 23 05:04:36 localhost podman[322251]: 2026-02-23 10:04:35.999625762 +0000 UTC m=+0.169044276 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, version=9.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 23 05:04:36 localhost podman[322251]: 2026-02-23 10:04:36.012309215 +0000 UTC m=+0.181727689 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter)
Feb 23 05:04:36 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully.
Feb 23 05:04:36 localhost nova_compute[282206]: 2026-02-23 10:04:36.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 05:04:36 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo'
Feb 23 05:04:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:04:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:04:37 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:04:38 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e50: np0005626465.hlpkwo(active, since 13m), standbys: np0005626463.wtksup, np0005626466.nisqfq
Feb 23 05:04:38 localhost nova_compute[282206]: 2026-02-23 10:04:38.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 05:04:38 localhost nova_compute[282206]: 2026-02-23 10:04:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 05:04:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 05:04:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:04:38 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 05:04:38 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 05:04:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 05:04:38 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:04:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:04:38 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 05:04:39 localhost nova_compute[282206]: 2026-02-23 10:04:39.244 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:04:39 localhost nova_compute[282206]: 2026-02-23 10:04:39.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:04:39 localhost nova_compute[282206]: 2026-02-23 10:04:39.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 05:04:39 localhost nova_compute[282206]: 2026-02-23 10:04:39.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:04:39 localhost nova_compute[282206]: 2026-02-23 10:04:39.269 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:04:39 localhost nova_compute[282206]: 2026-02-23 10:04:39.270 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:04:39 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:04:39 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:04:39 localhost podman[242954]: time="2026-02-23T10:04:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 23 05:04:39 localhost podman[242954]: @ - - [23/Feb/2026:10:04:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 23 05:04:39 localhost podman[242954]: @ - - [23/Feb/2026:10:04:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18831 "" "Go-http-client/1.1"
Feb 23 05:04:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:04:40 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:04:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:04:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:41 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:04:41 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:04:41 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:04:42 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 05:04:42 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:42 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:42 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:04:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:04:43 localhost openstack_network_exporter[245358]: ERROR 10:04:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 23 05:04:43 localhost openstack_network_exporter[245358]:
Feb 23 05:04:43 localhost openstack_network_exporter[245358]: ERROR 10:04:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 23 05:04:43 localhost openstack_network_exporter[245358]:
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.271 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:04:44 localhost ovn_metadata_agent[163567]: 2026-02-23 10:04:44.797 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Feb 23 05:04:44 localhost ovn_metadata_agent[163567]: 2026-02-23 10:04:44.797 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.841 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:04:44 localhost nova_compute[282206]: 2026-02-23 10:04:44.842 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:04:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:04:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:44 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:04:45 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 23 05:04:45 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 05:04:45 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...)
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740", "osd", "allow rw pool=manila_data namespace=fsvolumens_f47619b3-d060-43cb-beb4-45b54645fdc0", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 23 05:04:45 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 23 05:04:46 localhost nova_compute[282206]: 2026-02-23 10:04:46.822 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Feb 23 05:04:46 localhost nova_compute[282206]: 2026-02-23 10:04:46.823 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Feb 23 05:04:46 localhost nova_compute[282206]: 2026-02-23 10:04:46.845 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Feb 23 05:04:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0)
Feb 23 05:04:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 05:04:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 05:04:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 23 05:04:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:04:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Feb 23 05:04:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:04:48.564 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Feb 23 05:04:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:04:48.565 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Feb 23 05:04:48 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch
Feb 23 05:04:48 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 05:04:48 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch
Feb 23 05:04:48 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished
Feb 23 05:04:48 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/f47619b3-d060-43cb-beb4-45b54645fdc0/84bd9b7f-b699-46ec-b8f3-fe3b99d4f740],prefix=session evict} (starting...)
Feb 23 05:04:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0)
Feb 23 05:04:48 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:48 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:04:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 23 05:04:49 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 23 05:04:49 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 23 05:04:49 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:49 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch
Feb 23 05:04:49 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished
Feb 23 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.
Feb 23 05:04:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.
Feb 23 05:04:49 localhost nova_compute[282206]: 2026-02-23 10:04:49.843 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:04:49 localhost nova_compute[282206]: 2026-02-23 10:04:49.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Feb 23 05:04:49 localhost nova_compute[282206]: 2026-02-23 10:04:49.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Feb 23 05:04:49 localhost nova_compute[282206]: 2026-02-23 10:04:49.845 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:04:49 localhost nova_compute[282206]: 2026-02-23 10:04:49.874 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Feb 23 05:04:49 localhost nova_compute[282206]: 2026-02-23 10:04:49.874 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Feb 23 05:04:49 localhost podman[322296]: 2026-02-23 10:04:49.948565775 +0000 UTC m=+0.074580807 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 23 05:04:49 localhost podman[322296]: 2026-02-23 10:04:49.956071666 +0000 UTC m=+0.082086708 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 23 05:04:49 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully.
Feb 23 05:04:50 localhost podman[322295]: 2026-02-23 10:04:50.055584743 +0000 UTC m=+0.216019759 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 23 05:04:50 localhost podman[322295]: 2026-02-23 10:04:50.096343853 +0000 UTC m=+0.256778879 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 23 05:04:50 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully.
Feb 23 05:04:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e243 do_prune osdmap full prune enabled Feb 23 05:04:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 e244: 6 total, 6 up, 6 in Feb 23 05:04:50 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in Feb 23 05:04:50 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:04:50 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:04:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:04:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:51 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:04:51 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:51 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:51 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:51 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:53 localhost ovn_metadata_agent[163567]: 2026-02-23 10:04:53.799 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:04:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", 
"caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:54 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:04:54 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:54 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:54 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": 
"auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592", "osd", "allow rw pool=manila_data namespace=fsvolumens_11555dd1-63b1-44b4-8930-21367a0b0414", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:54 localhost ovn_controller[157695]: 2026-02-23T10:04:54Z|00365|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 23 05:04:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:04:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:54 localhost nova_compute[282206]: 2026-02-23 10:04:54.876 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:54 localhost nova_compute[282206]: 2026-02-23 10:04:54.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:04:54 localhost nova_compute[282206]: 2026-02-23 10:04:54.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:04:54 localhost nova_compute[282206]: 2026-02-23 10:04:54.878 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:54 localhost nova_compute[282206]: 2026-02-23 10:04:54.911 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:54 localhost nova_compute[282206]: 2026-02-23 10:04:54.912 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:04:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice 
bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:04:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:04:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:04:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:04:55 localhost podman[322344]: 2026-02-23 10:04:55.923980662 +0000 UTC m=+0.090279603 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Feb 23 05:04:55 localhost systemd[1]: tmp-crun.LWeMpO.mount: Deactivated successfully. 
Feb 23 05:04:55 localhost podman[322345]: 2026-02-23 10:04:55.987166464 +0000 UTC m=+0.143715353 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:04:56 localhost podman[322345]: 2026-02-23 10:04:56.001490977 +0000 UTC m=+0.158039966 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute) Feb 23 05:04:56 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:04:56 localhost podman[322344]: 2026-02-23 10:04:56.05883274 +0000 UTC m=+0.225131691 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:04:56 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.148 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.155 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b483bb33-2129-4e5d-acb0-9691f0f0636c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.149495', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ada6648-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'ad3873d0779aa0810f07dec5fb28355bace6915f3e5c9fa5dcbaf2d6503da93a'}]}, 'timestamp': '2026-02-23 10:04:56.156473', '_unique_id': '83384a36efdd4784a747e4ba8302584a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.159 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.175 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.176 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed91d370-934a-49c3-885d-dc529ec9fcc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.160508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1add7356-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': '7e3f0f92ef057ea5eec85b9edcaaad0899ff2971762fe62380bcc5bc9b8e9705'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.160508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1add8990-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': 'c3b5903503049f4ea064867878d06c3accfa981c4938fd3607f7cca27e09976c'}]}, 'timestamp': '2026-02-23 10:04:56.177010', '_unique_id': 'bc6b27d7fcc940c5bc7c8c34deb212ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.178 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.179 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de403f24-e15b-4f59-a6b3-b111aca7d265', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.179626', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ade0c3a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '1da2133420d5ef754424bfe042a2edf0b0c478885860a95f8f629476a1d7a64c'}]}, 'timestamp': '2026-02-23 10:04:56.180375', '_unique_id': '54e09f552326489bb11a1b4610aac8d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.181 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.218 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.219 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5ecccfe2-9274-497d-93e0-cb6340ef745c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.183594', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae40202-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '859c44a1cf8e63742ecc079849337edd3e188b2552693953fe718c85d63013df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.183594', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae418c8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '5d55c874b6bdb07f290de634bec94923b1b8b57f73f730ca04f2e4b073375c49'}]}, 'timestamp': '2026-02-23 10:04:56.219934', '_unique_id': '20d16bb1bfe541e2b357eff593772cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:04:56.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:04:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.223 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:04:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c5ddbee-9f2b-4a0c-a7d5-c96081934f57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.223776', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae4c494-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '5e16149ffe304f504afc8c18d31c3544ea0dd0053d81e715ce32a99aa53f7dee'}]}, 'timestamp': '2026-02-23 10:04:56.224301', '_unique_id': 'fa02983e8a744aa6b5cdb753b54d4f3e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e5e51475-38d4-47f5-8920-765749077ffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.226691', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae5355a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'af723f0ea35d470832f92aafc35ec51a42d0734f9eb047a5d3950535722b3df0'}]}, 'timestamp': '2026-02-23 10:04:56.227188', '_unique_id': '1cf9d841bacc4a338c4e30fb904e39e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.228 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.230 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd7a6ea0-72ec-40d1-aaa9-e3a32bb47760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.229834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae5aff8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'd38b5cb2394ec2c9b254c2639e46202676cf8bd5652d8c5eb28a431603261e93'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.229834', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae5c038-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '7584869702a5884c07d5b2359a0c70ea1fb41bb512201b050c83d2b875b9df73'}]}, 'timestamp': '2026-02-23 10:04:56.230709', '_unique_id': 'fe70f00ad73e4112a8b8916777db196a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.231 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.232 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '813e51fb-0fd0-4d44-a1cc-3e02453c9337', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.232972', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae62a00-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'b57734b39e6f764a844628d91e778233b52a0a692e26b9b80e21a2edcaa51af6'}]}, 'timestamp': '2026-02-23 10:04:56.233446', '_unique_id': 'f90fdb96f13b4ec6b0eae23e970b6aab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.234 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.235 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '71887e8b-e2e4-4c14-bf30-035e3dacb0a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.235576', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae68edc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '014544f5d8c790a2674ac065fa892c66ae8934b869d023a63e0521220e5005a6'}]}, 'timestamp': '2026-02-23 10:04:56.236067', '_unique_id': '8ae4d8629c36460c8caade65bd1fd4dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.237 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.238 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7020d685-eeb1-40f3-8e17-356f980a2298', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.238178', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae6f476-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '4a586facbfd28767f474d766fea113cd3652706610c83a053e831c4984855397'}]}, 'timestamp': '2026-02-23 10:04:56.238630', '_unique_id': '120f6687079042f9abfd17a514dfecce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.239 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60aa8e98-f9ce-43a9-9cc8-df05a87bc3cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.240708', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1ae75966-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': '296d9405d991f91237d9118741a5759835458ee2ffcccd1cbd39b58647fa0cdd'}]}, 'timestamp': '2026-02-23 10:04:56.241221', '_unique_id': '4953b6c3c21747bfbd4a38eb606c3c53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.242 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf245ad-d37f-4a28-810a-4ae9bf21b98e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.243278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae7bb54-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'f2aa55c4fc81e3bcdd62cfb7b0d0719f7be766000d2454a7601b8d710264513b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.243278', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae7ccca-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'caed60d2e53e30935e149e6e20afb1b214f0bc2ac110f3b8b4fb65cdb234c78d'}]}, 'timestamp': '2026-02-23 10:04:56.244141', '_unique_id': '879a6b91d5034719a0ce7093f57d704b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.245 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.247 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1e45d0d7-8f23-4e2c-8fbc-d415e5f170ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.246541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae83b6a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '7a094cdff3f5c5e0189581c79e226f9c97762a861cdc2bc46f059cf77e133a86'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.246541', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae84ce0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '8bc24c4c29a9b3e5585e3939cd43c74b574d66ada3fbff05699570510e3a2548'}]}, 'timestamp': '2026-02-23 10:04:56.247542', '_unique_id': '6b269a460906478faec06858e79a534f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.248 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f8ce283-c990-44bd-8312-d69d82b08760', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.249678', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1ae8b6c6-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': '5455cf6bb10b6629958397a5b79459ae2c08fd9b5150d61000c7608677e4d011'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T10:04:56.249678', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1ae8c76a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': '43912296fcf9b23406c62adb1a94bf18ebcc2f582c981601448f9156a549eff9'}]}, 'timestamp': '2026-02-23 10:04:56.250554', '_unique_id': 'def1e7182d4b45d3a181f862acfac4dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.251 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.252 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6087281f-5421-419c-9903-162e26cdc609', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:04:56.252737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1aebc5be-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.459027746, 'message_signature': '2756da8eac731182bd9da0e5f5a562094925f3a800ec0e81dc9699c54d41832e'}]}, 'timestamp': '2026-02-23 10:04:56.270191', '_unique_id': 'ff69b64382fb49e7bbd16c51e6bee7cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.272 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.272 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.272 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26a47411-9b4c-4413-8fca-224d37da438a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.272390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1aec2c98-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '8a81802d02057f414f05d0eb703dce96680970d6aace8697ac745d19ab6d1790'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.272390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1aec3db4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '32ce533f13852ed104e4511d5b00e1e84fa2a9db76f0f865d5b864e02aa59d05'}]}, 'timestamp': '2026-02-23 10:04:56.273269', '_unique_id': '5001eab617d24b18941de5f1f54d39c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.274 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.275 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.275 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 16080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29096275-9287-44c3-814a-78729271a191', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16080000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:04:56.275404', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1aeca25e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.459027746, 'message_signature': 'e3846e6536b1ad9245f041a47297b1953894e2ea80217bcdfebf788e4a9f8b15'}]}, 'timestamp': '2026-02-23 10:04:56.275832', '_unique_id': '3b161aff77674335b22c0314d9067241'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.276 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.277 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.277 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.278 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6f10561f-3bfa-4f1e-b4f5-d2df26159836', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.277890', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1aed056e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': '4d23a6e9ab04b556fff82d63966d3060fbf33305221f7c3db9a48f0837f569ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:04:56.277890', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1aed14dc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.37311148, 'message_signature': 'ed914fb74d7b32b9793d1ab418126dc3bd3505f2f54c462189b7cefa306fdfc8'}]}, 'timestamp': '2026-02-23 10:04:56.278748', '_unique_id': '402d6a2fd7f0452db7ffaae4286823ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.279 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.280 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.280 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.281 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fe7311a-956b-4b6b-92c8-02512a69ecdc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:04:56.280831', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1aed7814-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': 'af1b948c35c0b73c422079a39508e652709c25ee9a74445ce2638d6bf06d1ffb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': 
'2026-02-23T10:04:56.280831', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1aed875a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.350025306, 'message_signature': 'f22d55c9e5fa80e5325b839fbc3f9d3e8035d8d3b218ce5e2e92202935371dbd'}]}, 'timestamp': '2026-02-23 10:04:56.281678', '_unique_id': 'ff0d3cc49095472784836b8d781c562a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.282 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.283 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4dabe0ba-de86-4714-9e41-b5c2aa446245', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.283800', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1aedebfa-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'fefec436f6dd6ab139d0d64ba137939dff77e18ab898c365e87efc5507e6f44d'}]}, 'timestamp': '2026-02-23 10:04:56.284284', '_unique_id': '12c6e54fb2304fff87cc7cd8b3eba9b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.285 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.286 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41d9381c-e2f7-477f-907d-9fc530ab3eca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:04:56.286385', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '1aee4f5a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12536.339004495, 'message_signature': 'a7a91bdf65aa60b92f5c64545cde52548ae8bcc91ffabfc3234592f80b6dae28'}]}, 'timestamp': '2026-02-23 10:04:56.286829', '_unique_id': '47d04eff04914df5b7492418d51ca2b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:04:56.287 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:04:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:04:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:04:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:04:56.287 12 ERROR oslo_messaging.notify.messaging Feb 23 05:04:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0) Feb 23 05:04:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:04:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:04:57 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/11555dd1-63b1-44b4-8930-21367a0b0414/72e5f463-1431-499e-98f9-e7d184066592],prefix=session evict} (starting...) 
Feb 23 05:04:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:04:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:57 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:04:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e244 do_prune osdmap full prune enabled Feb 23 05:04:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 e245: 6 total, 6 up, 6 in Feb 23 05:04:57 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in Feb 23 05:04:58 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:04:58 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:04:59 localhost nova_compute[282206]: 2026-02-23 10:04:59.617 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:04:59 localhost nova_compute[282206]: 2026-02-23 10:04:59.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:00 localhost ceph-mon[294160]: 
mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:05:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:05:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:00 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:01 localhost nova_compute[282206]: 2026-02-23 10:05:01.322 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:01 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", 
"entity": "client.alice", "format": "json"} : dispatch Feb 23 05:05:01 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:01 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:01 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} v 0) Feb 23 
05:05:03 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:03 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:03 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:03 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:03 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:03 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8", "osd", "allow rw pool=manila_data namespace=fsvolumens_08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:04 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:05:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:05:04 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:05:04 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:05:04 localhost nova_compute[282206]: 2026-02-23 10:05:04.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:04 localhost nova_compute[282206]: 2026-02-23 10:05:04.952 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:04 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:05:04 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:05:04 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:05:04 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:05:06 localhost nova_compute[282206]: 2026-02-23 10:05:06.345 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:05:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 05:05:06 localhost podman[322379]: 2026-02-23 10:05:06.918240044 +0000 UTC m=+0.089430306 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 05:05:06 localhost podman[322379]: 2026-02-23 10:05:06.933260399 +0000 UTC m=+0.104450631 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.7, release=1770267347) Feb 23 05:05:06 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:05:07 localhost podman[322380]: 2026-02-23 10:05:07.025420427 +0000 UTC m=+0.190231981 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 05:05:07 localhost podman[322380]: 2026-02-23 10:05:07.063601448 +0000 UTC m=+0.228413002 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:05:07 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:05:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0) Feb 23 05:05:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:07 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:07 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:07 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:07 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:07 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/08fb1b39-07a9-4d7d-bcb4-cece4d84a3fc/3090d14f-d090-4924-bc2f-5b879acc23d8],prefix=session evict} (starting...) Feb 23 05:05:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:07 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:08 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": 
"auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:05:08 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:08 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:08 localhost nova_compute[282206]: 2026-02-23 10:05:08.306 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:09 localhost podman[242954]: time="2026-02-23T10:05:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:05:09 localhost podman[242954]: @ - - [23/Feb/2026:10:05:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:05:09 localhost podman[242954]: @ - 
- [23/Feb/2026:10:05:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18830 "" "Go-http-client/1.1" Feb 23 05:05:09 localhost nova_compute[282206]: 2026-02-23 10:05:09.973 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:10 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:10 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:10 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:10 
localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Feb 23 05:05:10 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:05:10 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:05:10 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Feb 23 05:05:11 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Feb 23 05:05:12 localhost ovn_controller[157695]: 2026-02-23T10:05:12Z|00366|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:05:12 localhost nova_compute[282206]: 2026-02-23 10:05:12.569 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:13 localhost openstack_network_exporter[245358]: ERROR 10:05:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:05:13 localhost openstack_network_exporter[245358]: Feb 23 05:05:13 localhost openstack_network_exporter[245358]: ERROR 10:05:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:05:13 localhost openstack_network_exporter[245358]: Feb 23 05:05:13 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0) Feb 23 05:05:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:13 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:13 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...) Feb 23 05:05:13 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:13 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:13 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:13 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:14 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:14 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:14 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:05:14 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:14 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw 
pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:14 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:15 localhost nova_compute[282206]: 2026-02-23 10:05:15.007 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:15 localhost ovn_controller[157695]: 2026-02-23T10:05:15Z|00367|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:05:15 localhost nova_compute[282206]: 2026-02-23 10:05:15.513 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:16 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:16 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:17 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:17 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:17 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:17 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 23 05:05:17 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:05:17 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:05:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:05:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2058529482' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:05:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:05:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2058529482' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:05:17 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) 
Feb 23 05:05:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:05:18 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:05:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:05:18 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:05:20 localhost nova_compute[282206]: 2026-02-23 10:05:20.207 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0) Feb 23 05:05:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:20 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} 
(starting...) Feb 23 05:05:20 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:20 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:20 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:20 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:20 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:05:20 localhost podman[322423]: 2026-02-23 10:05:20.895406232 +0000 UTC m=+0.070888414 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, container_name=ovn_controller, io.buildah.version=1.43.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 05:05:20 localhost podman[322423]: 2026-02-23 10:05:20.958557573 +0000 UTC m=+0.134039785 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true) Feb 23 05:05:20 localhost systemd[1]: tmp-crun.oLd5Jl.mount: Deactivated successfully. Feb 23 05:05:20 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:05:20 localhost podman[322424]: 2026-02-23 10:05:20.976690264 +0000 UTC m=+0.148999298 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:05:20 localhost podman[322424]: 2026-02-23 10:05:20.988438047 +0000 UTC m=+0.160747091 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:05:21 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:05:21 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:05:21 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:21 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:21 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:21.532 265541 INFO neutron.agent.linux.ip_lib [None req-7b396933-5536-46b6-8c4e-1d8db45bcb74 - - - - - -] Device tap97945a14-af cannot be used as it has no MAC address#033[00m Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.557 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:21 localhost 
kernel: device tap97945a14-af entered promiscuous mode Feb 23 05:05:21 localhost ovn_controller[157695]: 2026-02-23T10:05:21Z|00368|binding|INFO|Claiming lport 97945a14-af1a-4a6f-ba38-cf9a96201926 for this chassis. Feb 23 05:05:21 localhost ovn_controller[157695]: 2026-02-23T10:05:21Z|00369|binding|INFO|97945a14-af1a-4a6f-ba38-cf9a96201926: Claiming unknown Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:21 localhost NetworkManager[5974]: [1771841121.5732] manager: (tap97945a14-af): new Generic device (/org/freedesktop/NetworkManager/Devices/57) Feb 23 05:05:21 localhost systemd-udevd[322477]: Network interface NamePolicy= disabled on kernel command line. Feb 23 05:05:21 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:21.576 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e7be559e0474f2f877f7adf99941064', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=70ad7e38-c8d0-4a81-8fc5-3d8731b8b543, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=97945a14-af1a-4a6f-ba38-cf9a96201926) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:05:21 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:21.578 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 97945a14-af1a-4a6f-ba38-cf9a96201926 in datapath 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6 bound to our chassis#033[00m Feb 23 05:05:21 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:21.580 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Port ace0f608-b357-4062-b0dd-1f7d83446866 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Feb 23 05:05:21 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:21.580 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 05:05:21 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:21.581 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[000882b1-ff00-46fe-b2f6-1cda86ed765b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.604 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:21 localhost ovn_controller[157695]: 2026-02-23T10:05:21Z|00370|binding|INFO|Setting lport 97945a14-af1a-4a6f-ba38-cf9a96201926 ovn-installed in OVS Feb 23 05:05:21 localhost ovn_controller[157695]: 
2026-02-23T10:05:21Z|00371|binding|INFO|Setting lport 97945a14-af1a-4a6f-ba38-cf9a96201926 up in Southbound Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.608 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.614 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost journal[231253]: ethtool ioctl error on tap97945a14-af: No such device Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.650 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:21 localhost nova_compute[282206]: 2026-02-23 10:05:21.677 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:22 localhost nova_compute[282206]: 2026-02-23 10:05:22.185 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:22 localhost podman[322549]: Feb 23 05:05:22 localhost podman[322549]: 2026-02-23 10:05:22.581826676 +0000 UTC m=+0.092735998 container create 
c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 05:05:22 localhost systemd[1]: Started libpod-conmon-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c.scope. Feb 23 05:05:22 localhost podman[322549]: 2026-02-23 10:05:22.534794582 +0000 UTC m=+0.045703944 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Feb 23 05:05:22 localhost systemd[1]: Started libcrun container. Feb 23 05:05:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0cf8b8c1279fdb39306472f422941fc3ae1cade469254f26899cabd285efde04/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Feb 23 05:05:22 localhost podman[322549]: 2026-02-23 10:05:22.650803368 +0000 UTC m=+0.161712680 container init c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true) Feb 23 05:05:22 localhost podman[322549]: 2026-02-23 10:05:22.661990204 +0000 UTC m=+0.172899546 container start 
c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0) Feb 23 05:05:22 localhost dnsmasq[322567]: started, version 2.85 cachesize 150 Feb 23 05:05:22 localhost dnsmasq[322567]: DNS service limited to local subnets Feb 23 05:05:22 localhost dnsmasq[322567]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Feb 23 05:05:22 localhost dnsmasq[322567]: warning: no upstream servers configured Feb 23 05:05:22 localhost dnsmasq-dhcp[322567]: DHCP, static leases only on 10.100.0.0, lease time 1d Feb 23 05:05:22 localhost dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 0 addresses Feb 23 05:05:22 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host Feb 23 05:05:22 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts Feb 23 05:05:22 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:22.912 265541 INFO neutron.agent.dhcp.agent [None req-de2561f4-4069-497e-8bd9-5861df172dea - - - - - -] DHCP configuration for ports {'c899d854-97b2-4f5c-9a4f-b3ac893e22e2'} is completed#033[00m Feb 23 05:05:22 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:23.373 265541 
INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:05:23Z, description=, device_id=36bd3448-ec4f-40e6-b201-5bd21215f6b7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7dafe099-5e91-4531-bb08-a4050630ab61, ip_allocation=immediate, mac_address=fa:16:3e:cd:94:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:05:19Z, description=, dns_domain=, id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1611802295-network, port_security_enabled=True, project_id=1e7be559e0474f2f877f7adf99941064, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3734, status=ACTIVE, subnets=['5bfa3b6e-e658-4e35-be2b-09dffb18d94e'], tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:20Z, vlan_transparent=None, network_id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, port_security_enabled=False, project_id=1e7be559e0474f2f877f7adf99941064, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3742, status=DOWN, tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:23Z on network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6#033[00m Feb 23 05:05:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:23 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:23 localhost dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 1 addresses Feb 23 05:05:23 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host Feb 23 05:05:23 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts Feb 23 05:05:23 localhost podman[322585]: 2026-02-23 10:05:23.602976333 +0000 UTC m=+0.067534669 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:05:23 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:23.859 265541 INFO neutron.agent.dhcp.agent [None req-246d2597-6dee-4296-8c20-68a48c971254 - - - - - -] DHCP configuration for ports {'7dafe099-5e91-4531-bb08-a4050630ab61'} is completed#033[00m Feb 23 05:05:23 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:23 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:23 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:23 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:24 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Feb 23 05:05:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:05:24 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:05:24 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice_bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:05:24 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:24.305 265541 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-23T10:05:23Z, description=, device_id=36bd3448-ec4f-40e6-b201-5bd21215f6b7, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7dafe099-5e91-4531-bb08-a4050630ab61, ip_allocation=immediate, mac_address=fa:16:3e:cd:94:06, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-23T10:05:19Z, description=, dns_domain=, id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1611802295-network, port_security_enabled=True, project_id=1e7be559e0474f2f877f7adf99941064, provider:network_type=geneve, 
provider:physical_network=None, provider:segmentation_id=28649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3734, status=ACTIVE, subnets=['5bfa3b6e-e658-4e35-be2b-09dffb18d94e'], tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:20Z, vlan_transparent=None, network_id=1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, port_security_enabled=False, project_id=1e7be559e0474f2f877f7adf99941064, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3742, status=DOWN, tags=[], tenant_id=1e7be559e0474f2f877f7adf99941064, updated_at=2026-02-23T10:05:23Z on network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6#033[00m Feb 23 05:05:24 localhost dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 1 addresses Feb 23 05:05:24 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host Feb 23 05:05:24 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts Feb 23 05:05:24 localhost podman[322622]: 2026-02-23 10:05:24.506774613 +0000 UTC m=+0.061388789 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 05:05:24 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:24.872 265541 INFO neutron.agent.dhcp.agent [None req-9d74762b-d024-40a5-9b82-55f464dd2f56 - - - - - -] DHCP configuration for ports 
{'7dafe099-5e91-4531-bb08-a4050630ab61'} is completed#033[00m Feb 23 05:05:25 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Feb 23 05:05:25 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:05:25 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Feb 23 05:05:25 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Feb 23 05:05:25 localhost nova_compute[282206]: 2026-02-23 10:05:25.241 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:26 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0) Feb 23 05:05:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:05:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:05:26 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:26 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...) Feb 23 05:05:26 localhost podman[322643]: 2026-02-23 10:05:26.924534346 +0000 UTC m=+0.088126046 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.43.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Feb 23 05:05:26 localhost podman[322644]: 2026-02-23 10:05:26.983015494 +0000 UTC m=+0.143445856 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 23 05:05:27 localhost podman[322643]: 2026-02-23 10:05:27.006556392 +0000 UTC m=+0.170148112 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260216, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0) Feb 23 05:05:27 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:05:27 localhost podman[322644]: 2026-02-23 10:05:27.023543927 +0000 UTC m=+0.183974239 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 
'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.43.0) Feb 23 05:05:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:27 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:27 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:27 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 05:05:27 localhost nova_compute[282206]: 2026-02-23 10:05:27.078 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:27 localhost nova_compute[282206]: 2026-02-23 10:05:27.078 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:05:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:27 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 
05:05:27 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:05:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:05:28 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:28 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.874 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.874 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.875 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:05:28 localhost nova_compute[282206]: 2026-02-23 10:05:28.875 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:05:29 localhost nova_compute[282206]: 2026-02-23 10:05:29.377 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:05:29 localhost nova_compute[282206]: 2026-02-23 10:05:29.402 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:05:29 localhost nova_compute[282206]: 2026-02-23 10:05:29.403 282211 DEBUG nova.compute.manager [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:05:29 localhost nova_compute[282206]: 2026-02-23 10:05:29.683 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:30 localhost nova_compute[282206]: 2026-02-23 10:05:30.243 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:30 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:30 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:30 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-550070678", "caps": ["mds", "allow rw path=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e", "osd", "allow rw pool=manila_data namespace=fsvolumens_812d8099-4259-45c6-8802-8b5ec410d596", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:30 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:05:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice 
bob"} : dispatch Feb 23 05:05:30 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:05:30 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:05:31 localhost nova_compute[282206]: 2026-02-23 10:05:31.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:05:31 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:05:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:05:31 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:05:32 localhost nova_compute[282206]: 2026-02-23 10:05:32.776 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:05:32 localhost ceph-mon[294160]: log_channel(audit) log [INF] : 
from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:05:32 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0. Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.986582) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67 Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841132986666, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1449, "num_deletes": 251, "total_data_size": 1325057, "memory_usage": 1353056, "flush_reason": "Manual Compaction"} Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841132996260, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1301381, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36568, "largest_seqno": 38016, "table_properties": {"data_size": 1295011, "index_size": 3328, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16197, "raw_average_key_size": 20, "raw_value_size": 1280910, "raw_average_value_size": 1625, "num_data_blocks": 144, "num_entries": 788, "num_filter_entries": 788, "num_deletions": 251, 
"num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841072, "oldest_key_time": 1771841072, "file_creation_time": 1771841132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}} Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 9729 microseconds, and 4964 cpu microseconds. Feb 23 05:05:32 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.996325) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1301381 bytes OK Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.996358) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.999948) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:32.999971) EVENT_LOG_v1 {"time_micros": 1771841132999964, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.000001) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 1317951, prev total WAL file size 1317951, number of live WAL files 2. Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.000723) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353234' seq:72057594037927935, type:22 .. 
'6B760031373735' seq:0, type:0; will stop at (end) Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1270KB)], [66(18MB)] Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133000824, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 20510826, "oldest_snapshot_seqno": -1} Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.073 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.074 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.074 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.075 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 14260 keys, 19462717 bytes, temperature: kUnknown Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133132683, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 19462717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19378571, "index_size": 47373, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35717, "raw_key_size": 383151, "raw_average_key_size": 26, "raw_value_size": 19133408, "raw_average_value_size": 1341, "num_data_blocks": 1771, "num_entries": 14260, "num_filter_entries": 14260, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": 
"leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841133, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}} Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.133143) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 19462717 bytes Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.135127) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.4 rd, 147.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 18.3 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(30.7) write-amplify(15.0) OK, records in: 14790, records dropped: 530 output_compression: NoCompression Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.135159) EVENT_LOG_v1 {"time_micros": 1771841133135144, "job": 40, "event": "compaction_finished", "compaction_time_micros": 131984, "compaction_time_cpu_micros": 63441, "output_level": 6, "num_output_files": 1, "total_output_size": 19462717, "num_input_records": 14790, "num_output_records": 14260, "num_subcompactions": 1, "output_compression": "NoCompression", 
"num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133135542, "job": 40, "event": "table_file_deletion", "file_number": 68} Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841133138450, "job": 40, "event": "table_file_deletion", "file_number": 66} Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.000575) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138601) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:05:33 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:05:33.138628) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting 
Feb 23 05:05:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} v 0) Feb 23 05:05:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:33 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=tempest-cephx-id-550070678,client_metadata.root=/volumes/_nogroup/812d8099-4259-45c6-8802-8b5ec410d596/c2facf6e-037e-4fc9-867d-c6a4c723aa9e],prefix=session evict} (starting...) Feb 23 05:05:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:05:33 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2976359123' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.592 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:05:33 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:05:33 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:05:33 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-550070678", "format": "json"} : dispatch Feb 23 05:05:33 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:33 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"} : dispatch Feb 23 05:05:33 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-550070678"}]': finished Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.673 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.674 282211 DEBUG nova.virt.libvirt.driver [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:05:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.890 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:05:33 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.892 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11213MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.892 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.892 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.966 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.967 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:05:33 localhost nova_compute[282206]: 2026-02-23 10:05:33.968 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:05:34 localhost nova_compute[282206]: 2026-02-23 10:05:34.016 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:05:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:05:34 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3581738185' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:05:34 localhost nova_compute[282206]: 2026-02-23 10:05:34.487 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:05:34 localhost nova_compute[282206]: 2026-02-23 10:05:34.493 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:05:34 localhost nova_compute[282206]: 2026-02-23 10:05:34.510 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:05:34 localhost nova_compute[282206]: 2026-02-23 10:05:34.511 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:05:34 localhost nova_compute[282206]: 2026-02-23 10:05:34.511 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:05:34 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:05:34 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:34 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:34 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow r pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:35 localhost nova_compute[282206]: 2026-02-23 10:05:35.246 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:35 localhost nova_compute[282206]: 2026-02-23 10:05:35.506 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:35 localhost nova_compute[282206]: 2026-02-23 10:05:35.507 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:05:35 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:05:36 localhost nova_compute[282206]: 2026-02-23 10:05:36.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e245 do_prune osdmap full prune enabled Feb 23 05:05:36 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:05:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e246 e246: 6 total, 6 up, 6 in Feb 23 05:05:36 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in Feb 23 05:05:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Feb 23 05:05:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : 
from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:05:37 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:05:37 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=alice bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:05:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e246 do_prune osdmap full prune enabled Feb 23 05:05:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 e247: 6 total, 6 up, 6 in Feb 23 05:05:37 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in Feb 23 05:05:37 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Feb 23 05:05:37 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:05:37 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Feb 23 05:05:37 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Feb 23 05:05:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:05:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. 
Feb 23 05:05:37 localhost systemd[1]: tmp-crun.GXQahf.mount: Deactivated successfully. Feb 23 05:05:37 localhost podman[322812]: 2026-02-23 10:05:37.935643109 +0000 UTC m=+0.088213377 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 05:05:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:37 localhost podman[322811]: 2026-02-23 10:05:37.979070192 +0000 UTC m=+0.130984130 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9) Feb 23 05:05:37 localhost podman[322812]: 2026-02-23 10:05:37.998546514 +0000 UTC m=+0.151116782 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 05:05:38 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 05:05:38 localhost podman[322811]: 2026-02-23 10:05:38.021347189 +0000 UTC m=+0.173261147 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1770267347, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, version=9.7) Feb 23 05:05:38 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:05:38 localhost nova_compute[282206]: 2026-02-23 10:05:38.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:39 localhost podman[242954]: time="2026-02-23T10:05:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:05:39 localhost podman[242954]: @ - - [23/Feb/2026:10:05:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1" Feb 23 05:05:39 localhost podman[242954]: @ - - [23/Feb/2026:10:05:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19314 "" "Go-http-client/1.1" Feb 23 05:05:39 localhost dnsmasq[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/addn_hosts - 0 addresses Feb 23 05:05:39 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/host Feb 23 05:05:39 localhost dnsmasq-dhcp[322567]: read /var/lib/neutron/dhcp/1fd1ea36-61b2-4373-a7fa-84d2547a4ab6/opts Feb 23 05:05:39 localhost podman[322873]: 2026-02-23 10:05:39.78558504 +0000 UTC m=+0.069363096 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:05:40 localhost ovn_controller[157695]: 
2026-02-23T10:05:40Z|00372|binding|INFO|Releasing lport 97945a14-af1a-4a6f-ba38-cf9a96201926 from this chassis (sb_readonly=0) Feb 23 05:05:40 localhost kernel: device tap97945a14-af left promiscuous mode Feb 23 05:05:40 localhost ovn_controller[157695]: 2026-02-23T10:05:40Z|00373|binding|INFO|Setting lport 97945a14-af1a-4a6f-ba38-cf9a96201926 down in Southbound Feb 23 05:05:40 localhost nova_compute[282206]: 2026-02-23 10:05:40.037 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:40 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:40.045 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005626463.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpfb23302c-55c1-5de0-badf-4fc1ff22837a-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1e7be559e0474f2f877f7adf99941064', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005626463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70ad7e38-c8d0-4a81-8fc5-3d8731b8b543, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=97945a14-af1a-4a6f-ba38-cf9a96201926) old=Port_Binding(up=[True], chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:05:40 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:40.047 163572 INFO neutron.agent.ovn.metadata.agent [-] Port 97945a14-af1a-4a6f-ba38-cf9a96201926 in datapath 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6 unbound from our chassis#033[00m Feb 23 05:05:40 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:40.050 163572 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Feb 23 05:05:40 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:40.051 163675 DEBUG oslo.privsep.daemon [-] privsep: reply[b8fc5c99-fa1b-47e5-b8dd-3baf768006a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Feb 23 05:05:40 localhost nova_compute[282206]: 2026-02-23 10:05:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:40 localhost nova_compute[282206]: 2026-02-23 10:05:40.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:05:40 localhost nova_compute[282206]: 2026-02-23 10:05:40.060 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:40 localhost nova_compute[282206]: 2026-02-23 10:05:40.249 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} v 0) Feb 23 05:05:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:40 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 23 05:05:40 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", 
"allow r"], "format": "json"} : dispatch Feb 23 05:05:40 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"} : dispatch Feb 23 05:05:40 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02", "mon", "allow r"], "format": "json"}]': finished Feb 23 05:05:41 localhost ovn_controller[157695]: 2026-02-23T10:05:41Z|00374|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:05:41 localhost nova_compute[282206]: 2026-02-23 10:05:41.173 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:41 localhost dnsmasq[322567]: exiting on receipt of SIGTERM Feb 23 05:05:41 localhost podman[322914]: 2026-02-23 10:05:41.620130363 +0000 UTC m=+0.066166246 container kill c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, maintainer=OpenStack Kubernetes 
Operator team) Feb 23 05:05:41 localhost systemd[1]: libpod-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c.scope: Deactivated successfully. Feb 23 05:05:41 localhost podman[322927]: 2026-02-23 10:05:41.701192209 +0000 UTC m=+0.061866264 container died c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0) Feb 23 05:05:41 localhost podman[322927]: 2026-02-23 10:05:41.74556717 +0000 UTC m=+0.106241195 container cleanup c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:05:41 localhost systemd[1]: libpod-conmon-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c.scope: Deactivated successfully. 
Feb 23 05:05:41 localhost sshd[322951]: main: sshd: ssh-rsa algorithm is disabled Feb 23 05:05:41 localhost podman[322928]: 2026-02-23 10:05:41.828493345 +0000 UTC m=+0.184397702 container remove c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1fd1ea36-61b2-4373-a7fa-84d2547a4ab6, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:05:41 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:41.862 265541 INFO neutron.agent.dhcp.agent [None req-00909490-9987-4e52-b130-5408b2448de2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:05:41 localhost neutron_dhcp_agent[265537]: 2026-02-23 10:05:41.865 265541 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Feb 23 05:05:42 localhost systemd[1]: var-lib-containers-storage-overlay-0cf8b8c1279fdb39306472f422941fc3ae1cade469254f26899cabd285efde04-merged.mount: Deactivated successfully. Feb 23 05:05:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c954a8ff366e93a4167ce730a1032544b49b69c5edd113d980b7645d9211801c-userdata-shm.mount: Deactivated successfully. Feb 23 05:05:42 localhost systemd[1]: run-netns-qdhcp\x2d1fd1ea36\x2d61b2\x2d4373\x2da7fa\x2d84d2547a4ab6.mount: Deactivated successfully. 
Feb 23 05:05:42 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:43 localhost openstack_network_exporter[245358]: ERROR 10:05:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:05:43 localhost openstack_network_exporter[245358]: Feb 23 05:05:43 localhost openstack_network_exporter[245358]: ERROR 10:05:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:05:43 localhost openstack_network_exporter[245358]: Feb 23 05:05:44 localhost ovn_controller[157695]: 2026-02-23T10:05:44Z|00375|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:05:44 localhost nova_compute[282206]: 2026-02-23 10:05:44.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:05:44 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:05:45 localhost nova_compute[282206]: 2026-02-23 10:05:45.252 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw 
pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} v 0) Feb 23 05:05:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} : dispatch Feb 23 05:05:47 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]}]': finished Feb 23 05:05:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e247 do_prune osdmap full prune enabled Feb 23 05:05:47 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 e248: 6 total, 6 up, 6 in Feb 23 05:05:48 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in Feb 23 05:05:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:48.564 
163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:05:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:48.565 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:05:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:48.566 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:05:48 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 23 05:05:48 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} : dispatch Feb 23 05:05:48 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]} : dispatch Feb 23 05:05:48 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae,allow rw path=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02,allow rw pool=manila_data namespace=fsvolumens_0f0c4f00-1527-445a-bf94-f839c0a6f476"]}]': finished Feb 23 05:05:48 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 23 05:05:50 localhost nova_compute[282206]: 2026-02-23 10:05:50.254 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:51 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} v 0) Feb 23 05:05:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} : dispatch Feb 23 05:05:51 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]}]': finished Feb 23 05:05:51 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=bob,client_metadata.root=/volumes/_nogroup/0f0c4f00-1527-445a-bf94-f839c0a6f476/15dadb0f-d665-4d06-aa15-aab1aaac7a26],prefix=session evict} (starting...) Feb 23 05:05:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:05:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 05:05:51 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 23 05:05:51 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} : dispatch Feb 23 05:05:51 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]} : dispatch Feb 23 05:05:51 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae", "osd", "allow rw pool=manila_data namespace=fsvolumens_a87f3747-06f6-4188-82ee-060b8ce9fc02"]}]': finished Feb 23 05:05:51 localhost podman[322957]: 2026-02-23 10:05:51.923958019 +0000 UTC m=+0.092234923 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true) Feb 23 05:05:51 localhost podman[322957]: 2026-02-23 10:05:51.973266402 +0000 UTC m=+0.141543306 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:05:51 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 05:05:52 localhost podman[322958]: 2026-02-23 10:05:51.975088929 +0000 UTC m=+0.139062980 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:05:52 localhost podman[322958]: 2026-02-23 10:05:52.058081055 +0000 UTC m=+0.222055096 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:05:52 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:05:52 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:54 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0) Feb 23 05:05:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 23 05:05:54 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Feb 23 05:05:54 localhost ceph-mds[286877]: mds.mds.np0005626463.qcthuc asok_command: session evict {filters=[auth_name=bob,client_metadata.root=/volumes/_nogroup/a87f3747-06f6-4188-82ee-060b8ce9fc02/83a6be9d-9350-4487-b928-33718d580aae],prefix=session evict} (starting...) Feb 23 05:05:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Feb 23 05:05:55 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 23 05:05:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Feb 23 05:05:55 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Feb 23 05:05:55 localhost nova_compute[282206]: 2026-02-23 10:05:55.082 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:55 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:55.081 163572 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:05:55 localhost ovn_metadata_agent[163567]: 2026-02-23 10:05:55.083 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:05:55 localhost nova_compute[282206]: 2026-02-23 10:05:55.256 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:05:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:05:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:05:57 localhost podman[323006]: 2026-02-23 10:05:57.960334919 +0000 UTC m=+0.132318642 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.43.0) Feb 23 05:05:57 localhost systemd[1]: 
tmp-crun.qdUabg.mount: Deactivated successfully. Feb 23 05:05:57 localhost podman[323007]: 2026-02-23 10:05:57.98041423 +0000 UTC m=+0.149898055 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260216, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Feb 23 05:05:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:05:57 localhost podman[323007]: 2026-02-23 10:05:57.989196602 +0000 UTC m=+0.158680387 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:05:58 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:05:58 localhost podman[323006]: 2026-02-23 10:05:58.047959238 +0000 UTC m=+0.219942991 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true) Feb 23 05:05:58 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:06:00 localhost nova_compute[282206]: 2026-02-23 10:06:00.259 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:02 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:03 localhost ovn_metadata_agent[163567]: 2026-02-23 10:06:03.085 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:06:05 localhost nova_compute[282206]: 2026-02-23 10:06:05.261 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:06 localhost ceph-mgr[288036]: client.0 ms_handle_reset on v2:172.18.0.107:6810/2356945423 Feb 23 05:06:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:06:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:06:08 localhost podman[323044]: 2026-02-23 10:06:08.929137465 +0000 UTC m=+0.097491836 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, io.openshift.expose-services=, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter) Feb 23 05:06:08 localhost podman[323044]: 2026-02-23 10:06:08.96586379 +0000 UTC m=+0.134218151 container exec_died 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7) Feb 23 05:06:08 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:06:08 localhost systemd[1]: tmp-crun.98HFpw.mount: Deactivated successfully. 
Feb 23 05:06:09 localhost podman[323045]: 2026-02-23 10:06:09.002684268 +0000 UTC m=+0.161255926 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 05:06:09 localhost podman[323045]: 2026-02-23 10:06:09.010534301 +0000 UTC m=+0.169105989 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:06:09 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:06:09 localhost podman[242954]: time="2026-02-23T10:06:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:06:09 localhost podman[242954]: @ - - [23/Feb/2026:10:06:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:06:09 localhost podman[242954]: @ - - [23/Feb/2026:10:06:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18830 "" "Go-http-client/1.1" Feb 23 05:06:09 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:09 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:10 localhost nova_compute[282206]: 2026-02-23 10:06:10.264 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:10 localhost nova_compute[282206]: 2026-02-23 10:06:10.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:10 localhost nova_compute[282206]: 2026-02-23 10:06:10.266 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:06:10 localhost nova_compute[282206]: 2026-02-23 10:06:10.266 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:10 localhost nova_compute[282206]: 2026-02-23 10:06:10.267 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:10 localhost nova_compute[282206]: 2026-02-23 10:06:10.269 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 5590 writes, 38K keys, 5588 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s#012Cumulative WAL: 5590 writes, 5588 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2419 writes, 11K keys, 2417 commit groups, 1.0 writes per commit group, ingest: 11.37 MB, 0.02 MB/s#012Interval WAL: 2419 writes, 2417 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 143.8 0.33 0.13 20 0.016 0 0 0.0 0.0#012 L6 1/0 18.56 MB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 6.7 158.1 145.1 2.19 0.90 19 0.115 239K 9915 0.0 0.0#012 Sum 1/0 
18.56 MB 0.0 0.3 0.0 0.3 0.4 0.1 0.0 7.7 137.5 144.9 2.52 1.02 39 0.065 239K 9915 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 12.2 146.0 147.5 1.01 0.45 16 0.063 110K 4310 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 0.0 158.1 145.1 2.19 0.90 19 0.115 239K 9915 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 145.3 0.32 0.13 19 0.017 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.6 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.046, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.36 GB write, 0.30 MB/s write, 0.34 GB read, 0.29 MB/s read, 2.5 seconds#012Interval compaction: 0.15 GB write, 0.25 MB/s write, 0.14 GB read, 0.24 MB/s read, 1.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5609fbab9350#2 capacity: 304.00 MB usage: 43.85 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.00036 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(2802,42.29 MB,13.9119%) FilterBlock(39,681.17 KB,0.218818%) IndexBlock(39,915.48 KB,0.294088%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0. Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.475940) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171476048, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 799, "num_deletes": 252, "total_data_size": 588734, "memory_usage": 603560, "flush_reason": "Manual Compaction"} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171484443, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 576994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38017, "largest_seqno": 38815, "table_properties": {"data_size": 573110, "index_size": 1611, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10096, "raw_average_key_size": 20, "raw_value_size": 564774, "raw_average_value_size": 1166, "num_data_blocks": 71, "num_entries": 484, "num_filter_entries": 484, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, 
"fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841133, "oldest_key_time": 1771841133, "file_creation_time": 1771841171, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 8560 microseconds, and 3257 cpu microseconds. Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.484513) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 576994 bytes OK Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.484538) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489200) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489222) EVENT_LOG_v1 {"time_micros": 1771841171489215, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489251) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 584545, prev total WAL file size 584545, number of live WAL files 2. Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489974) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. 
'7061786F73003133303533' seq:0, type:0; will stop at (end) Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(563KB)], [69(18MB)] Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171490039, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 20039711, "oldest_snapshot_seqno": -1} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14217 keys, 18348228 bytes, temperature: kUnknown Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171602284, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 18348228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18265588, "index_size": 45995, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35589, "raw_key_size": 383010, "raw_average_key_size": 26, "raw_value_size": 18022353, "raw_average_value_size": 1267, "num_data_blocks": 1708, "num_entries": 14217, "num_filter_entries": 14217, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841171, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.602613) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 18348228 bytes Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.604527) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.4 rd, 163.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.6 +0.0 blob) out(17.5 +0.0 blob), read-write-amplify(66.5) write-amplify(31.8) OK, records in: 14744, records dropped: 527 output_compression: NoCompression Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.604556) EVENT_LOG_v1 {"time_micros": 1771841171604543, "job": 42, "event": "compaction_finished", "compaction_time_micros": 112359, "compaction_time_cpu_micros": 51392, "output_level": 6, "num_output_files": 1, "total_output_size": 18348228, "num_input_records": 14744, "num_output_records": 14217, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171604785, "job": 42, "event": "table_file_deletion", "file_number": 71} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841171607500, "job": 42, "event": "table_file_deletion", "file_number": 69} Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.489857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607667) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:06:11 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:06:11.607671) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:06:12 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:13 
localhost openstack_network_exporter[245358]: ERROR 10:06:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:06:13 localhost openstack_network_exporter[245358]: Feb 23 05:06:13 localhost openstack_network_exporter[245358]: ERROR 10:06:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:06:13 localhost openstack_network_exporter[245358]: Feb 23 05:06:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:13 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:14 localhost ovn_controller[157695]: 2026-02-23T10:06:14Z|00376|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory Feb 23 05:06:15 localhost nova_compute[282206]: 2026-02-23 10:06:15.269 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:20 localhost nova_compute[282206]: 2026-02-23 10:06:20.272 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:20 localhost nova_compute[282206]: 2026-02-23 10:06:20.273 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:20 localhost nova_compute[282206]: 2026-02-23 10:06:20.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:06:20 localhost nova_compute[282206]: 2026-02-23 10:06:20.273 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:20 localhost nova_compute[282206]: 2026-02-23 10:06:20.274 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:06:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:06:22 localhost systemd[1]: tmp-crun.79pf9O.mount: Deactivated successfully. 
Feb 23 05:06:22 localhost podman[323088]: 2026-02-23 10:06:22.916587597 +0000 UTC m=+0.089538879 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:06:22 localhost podman[323088]: 2026-02-23 10:06:22.980328678 +0000 UTC m=+0.153279950 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0) Feb 23 05:06:22 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:06:22 localhost podman[323089]: 2026-02-23 10:06:22.993930348 +0000 UTC m=+0.162150294 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:06:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:23 localhost podman[323089]: 2026-02-23 10:06:23.02604364 +0000 UTC m=+0.194263616 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Feb 23 05:06:23 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:06:25 localhost nova_compute[282206]: 2026-02-23 10:06:25.274 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:25 localhost nova_compute[282206]: 2026-02-23 10:06:25.276 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:28 localhost nova_compute[282206]: 2026-02-23 10:06:28.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:28 localhost nova_compute[282206]: 2026-02-23 10:06:28.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:06:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:06:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:06:28 localhost podman[323137]: 2026-02-23 10:06:28.915595091 +0000 UTC m=+0.081494421 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute) Feb 23 05:06:28 localhost podman[323137]: 2026-02-23 10:06:28.956250538 +0000 UTC m=+0.122149868 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 05:06:28 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:06:29 localhost podman[323136]: 2026-02-23 10:06:28.965475963 +0000 UTC m=+0.132824367 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:06:29 localhost podman[323136]: 2026-02-23 10:06:29.049390047 +0000 UTC m=+0.216738421 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true) Feb 23 05:06:29 localhost nova_compute[282206]: 2026-02-23 10:06:29.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:29 localhost nova_compute[282206]: 2026-02-23 10:06:29.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:06:29 localhost nova_compute[282206]: 2026-02-23 10:06:29.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:06:29 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. 
Feb 23 05:06:30 localhost nova_compute[282206]: 2026-02-23 10:06:30.276 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:30 localhost nova_compute[282206]: 2026-02-23 10:06:30.279 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:30 localhost nova_compute[282206]: 2026-02-23 10:06:30.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:06:30 localhost nova_compute[282206]: 2026-02-23 10:06:30.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:06:30 localhost nova_compute[282206]: 2026-02-23 10:06:30.483 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:06:30 localhost nova_compute[282206]: 2026-02-23 10:06:30.483 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:06:31 localhost nova_compute[282206]: 2026-02-23 10:06:31.087 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache 
with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:06:31 localhost nova_compute[282206]: 2026-02-23 10:06:31.103 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:06:31 localhost nova_compute[282206]: 2026-02-23 10:06:31.103 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:06:32 localhost nova_compute[282206]: 2026-02-23 10:06:32.055 282211 DEBUG 
oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:34 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:06:34 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.079 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.079 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.080 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:06:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:06:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.278 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.283 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:06:35 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/240892429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.587 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.656 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.657 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:06:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:06:35 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.914 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.916 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11214MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.916 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.917 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.977 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.978 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:06:35 localhost nova_compute[282206]: 2026-02-23 10:06:35.979 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:06:36 localhost nova_compute[282206]: 2026-02-23 10:06:36.020 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:06:36 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:06:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:06:36 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3562749387' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:06:36 localhost nova_compute[282206]: 2026-02-23 10:06:36.476 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:06:36 localhost nova_compute[282206]: 2026-02-23 10:06:36.483 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:06:36 localhost nova_compute[282206]: 2026-02-23 10:06:36.505 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:06:36 localhost nova_compute[282206]: 2026-02-23 10:06:36.508 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:06:36 localhost nova_compute[282206]: 2026-02-23 10:06:36.508 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.592s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:06:37 localhost nova_compute[282206]: 2026-02-23 10:06:37.504 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:37 localhost nova_compute[282206]: 2026-02-23 10:06:37.505 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:38 localhost nova_compute[282206]: 2026-02-23 10:06:38.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:39 localhost podman[242954]: time="2026-02-23T10:06:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:06:39 localhost podman[242954]: @ - - [23/Feb/2026:10:06:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:06:39 localhost podman[242954]: @ - - [23/Feb/2026:10:06:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false 
HTTP/1.1" 200 18826 "" "Go-http-client/1.1" Feb 23 05:06:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:06:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:06:39 localhost podman[323305]: 2026-02-23 10:06:39.916424254 +0000 UTC m=+0.086923108 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:06:39 localhost podman[323305]: 2026-02-23 10:06:39.954149601 +0000 UTC m=+0.124648445 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck 
podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:06:39 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:06:40 localhost podman[323304]: 2026-02-23 10:06:39.961420685 +0000 UTC m=+0.132776675 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., distribution-scope=public, release=1770267347, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Feb 23 05:06:40 localhost podman[323304]: 2026-02-23 10:06:40.045218076 +0000 UTC m=+0.216574046 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.openshift.tags=minimal rhel9) Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:06:40 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.285 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.287 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.287 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.287 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.310 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:40 localhost nova_compute[282206]: 2026-02-23 10:06:40.311 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:43 localhost openstack_network_exporter[245358]: ERROR 10:06:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:06:43 localhost openstack_network_exporter[245358]: Feb 23 05:06:43 localhost openstack_network_exporter[245358]: ERROR 10:06:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:06:43 localhost openstack_network_exporter[245358]: Feb 23 05:06:44 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:44 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:44 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:44 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:45 localhost nova_compute[282206]: 2026-02-23 10:06:45.312 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:45 localhost nova_compute[282206]: 2026-02-23 10:06:45.312 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:45 localhost nova_compute[282206]: 2026-02-23 10:06:45.313 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:06:45 localhost nova_compute[282206]: 2026-02-23 10:06:45.313 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:45 localhost nova_compute[282206]: 2026-02-23 10:06:45.349 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:45 localhost nova_compute[282206]: 2026-02-23 10:06:45.350 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:06:48.566 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:06:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:06:48.567 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:06:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:06:48.567 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.351 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.353 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.353 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.353 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.378 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.379 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:50 localhost nova_compute[282206]: 2026-02-23 10:06:50.381 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:06:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:06:53 localhost systemd[1]: tmp-crun.G0mkmz.mount: Deactivated successfully. 
Feb 23 05:06:53 localhost podman[323347]: 2026-02-23 10:06:53.92879964 +0000 UTC m=+0.097315030 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0) Feb 23 05:06:53 localhost systemd[1]: tmp-crun.oiorvI.mount: Deactivated successfully. 
Feb 23 05:06:53 localhost podman[323348]: 2026-02-23 10:06:53.983904954 +0000 UTC m=+0.148374018 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:06:53 localhost podman[323348]: 2026-02-23 10:06:53.992139199 +0000 UTC m=+0.156608233 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:06:54 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:06:54 localhost podman[323347]: 2026-02-23 10:06:54.04814148 +0000 UTC m=+0.216656860 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible) Feb 23 05:06:54 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:06:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e248 do_prune osdmap full prune enabled Feb 23 05:06:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e249 e249: 6 total, 6 up, 6 in Feb 23 05:06:55 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in Feb 23 05:06:55 localhost nova_compute[282206]: 2026-02-23 10:06:55.382 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:55 localhost nova_compute[282206]: 2026-02-23 10:06:55.384 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:06:55 localhost nova_compute[282206]: 2026-02-23 10:06:55.385 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:06:55 localhost nova_compute[282206]: 2026-02-23 10:06:55.385 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:55 localhost nova_compute[282206]: 2026-02-23 10:06:55.420 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:06:55 localhost nova_compute[282206]: 2026-02-23 10:06:55.421 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.148 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 
'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.181 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.182 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3f71ddbd-8d31-402b-a39b-a773cc301534', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.150471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6264dea8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '1ac67800e2f6fa5758bbe59247e50fe04c413269d640e052a970e5f477afae0f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.150471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6264f8ca-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '1360af677187730c691f903f04dcaa038f87676732384424850bcc43a60bf1c6'}]}, 'timestamp': '2026-02-23 10:06:56.182826', '_unique_id': '843041e16a2f4acab0803fcf6434543f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.184 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.191 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2062ca99-eec0-499b-bcc9-ed2e074cc628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.186715', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '62667268-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': 'e2d8cfcf9d3a6f18a2d762b83ff7225d5d85be7f4518d5ac8ed56aca50a8b3e2'}]}, 'timestamp': '2026-02-23 10:06:56.192413', '_unique_id': '21bd71e7d55647d5b37ee2bd10514eff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.193 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.194 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.195 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '53d2c8c8-09c5-495b-b544-3f7079e3dcb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.194773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6266e69e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'dbbf22f2522bacb9d29950d07968a93b44813fe1fc4353f7f1846b91b317ee58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.194773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6266f9cc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '0dfab44c87dcefc03cc613ed20adef23d94e809432c4eb4b4adeba645643a9d8'}]}, 'timestamp': '2026-02-23 10:06:56.195847', '_unique_id': '7729a0cd57f04cecb185d2dcb884d36e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.196 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.198 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.213 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 16650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ed6dfed-b26a-4411-af9d-3f7159ac8e25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16650000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:06:56.198236', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 
'cpu_number': 1}, 'message_id': '6269c990-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.402913251, 'message_signature': '3f62761134e4663adcd01a8c6282d923e3d69559eb3d3c1a9aa538df31576053'}]}, 'timestamp': '2026-02-23 10:06:56.214337', '_unique_id': '2adef1521ba0498d97f7266543f751d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.215 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.217 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '97a5b6b2-0ba5-471d-b682-5edebd470bed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.217005', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626a49f6-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '75abbcaa37df0a76fc9a637b2e7bebcf1ee443e236d84281254080a47e4df2c5'}]}, 'timestamp': '2026-02-23 10:06:56.217585', '_unique_id': '12ddcebe6b274da59c0968e31904318d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.218 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.220 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43ceea4a-b5e1-4c65-b0c5-fa0b0a591344', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.220099', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626ac20a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '7ec4ac2c1152ba89f4dd246d0a47018d967c401962c4458dc41a2445c828cf91'}]}, 'timestamp': '2026-02-23 10:06:56.220684', '_unique_id': '3336ca5306ee40e599c8256f26d82b5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:06:56.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.221 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.223 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5e9a6908-0a09-46a6-9b43-b97218a9fe54', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.223479', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626b45e0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': 'ec5a1a9563304b42cbb2d0009d69e92ab3ce311fb6ddc781b5114e75696a0d47'}]}, 'timestamp': '2026-02-23 10:06:56.224147', '_unique_id': 'e9856e2cf2e643b2a0b5b93f00fbfece'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.225 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.226 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0999a21c-0e56-4297-9779-c7df1122549d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.226485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '626bbc8c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'e38f658ae9d81fd094f63689f76ef8587a3e1e3674199d6f661b9ef10c2efba5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.226485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '626bd1fe-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'ee686e58d219e8798e6854064e3e0345add842b7f9df94ab182a41b84c35fc69'}]}, 'timestamp': '2026-02-23 10:06:56.227612', '_unique_id': '71c184babfb04ce79485eba8c370955e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.228 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.229 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14f1536b-9a53-46ab-94ac-fe0ac7aff734', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.229938', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626c4300-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '0232e761e5ef2a0e08cb8b647479c0fcfa0ff3b9e1c4aa0e974ec294f62a988d'}]}, 'timestamp': '2026-02-23 10:06:56.230582', '_unique_id': '9eb2b8d39daf4591897506a35415cb6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.231 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.232 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ad4cf7e-cd2f-4e05-acfe-8d5aa09e3587', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.232849', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '626cb54c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '5ea3bdb22ceaad60db5418a32b066daedb2dddc6ef722698a1dcfaaf6ae20ddc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.232849', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '626ccb54-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '2f0d8ac59c3af8eefc2267d65d3d195340c0ec081f0418b93b050f3626eb2a61'}]}, 'timestamp': '2026-02-23 10:06:56.234033', '_unique_id': 'ba09a9e48ec54debbaa745d3a7ec2dde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.236 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.236 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ac47e51-75f6-4793-b30f-49f35f594a11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.236712', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626d4d2c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '0cadbbb5ebf60617a30b3839d0f600cb7aa5b0c80c2047f9cf34059aba78ffc9'}]}, 'timestamp': '2026-02-23 10:06:56.237327', '_unique_id': 'd86db4bf474042f79ab129613f4f2305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 
05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.250 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '85754099-30e2-4321-b388-54e9a2ce21f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.239699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '626f5d60-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '3fe8cfc956cc837215a24c89cdd002d049c2e379ac29d8393fd286ba91d951ec'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.239699', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '626f72b4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': 'a710113c08d3c203c304598db8717352566a45e1c05f57159fda8747dd668c5d'}]}, 'timestamp': '2026-02-23 10:06:56.251406', '_unique_id': '0de56de8be2444f7a49d087d5618f0a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:06:56.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.252 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.253 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae0af908-05c0-4ae2-a59c-efb3681857b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.253807', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '626fe8e8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '6b4bdb32c91fe218cf81b71fa5e55dda4280d2f52df135aed7fdd52cce46a85c'}]}, 'timestamp': '2026-02-23 10:06:56.254516', '_unique_id': 'da2c64731a5f4d14bf7b980ec6c74522'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.255 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.256 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32c100cc-360f-436d-a262-92e9c36dd9ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.256727', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62705a80-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': 'ecace1a42ee472b54f1e14d3ea07dddccc34b371e90a895b5e23eac78d7279ef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.256727', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62706d90-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '4be21145db42d467795147939922802ea334bccb37dc8e14bde9bbdd7898e835'}]}, 'timestamp': '2026-02-23 10:06:56.257826', '_unique_id': '3ec20b5740ad4921a6d8fc168b6e8469'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.258 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.260 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47cea142-4bc2-4daf-bddf-7c54903183b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:06:56.260104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6270dc6c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.402913251, 'message_signature': 'd96763d49d4d37360e316bc555317a5aaf9b3112b9ed7fd870cfb4fb64161e58'}]}, 'timestamp': '2026-02-23 10:06:56.260669', '_unique_id': 'f756c8cf879b4e678815bbfd822998e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23
10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.263 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5725afd8-7ec7-4301-af99-7915ee15a5f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.262958', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '62714bd4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '717cd262a69cea1ac07cd9d661dee18256619c0dbc3c5b4d38a28e844ab92348'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.262958', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62715ee4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '1f92cc7c60ccd2a708c0251c37cfff1b7c7263629fb9f0dbc80486e82d8e547c'}]}, 'timestamp': '2026-02-23 10:06:56.264029', '_unique_id': '414ff6517d1d4b7a932490345a4e7cfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:06:56.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:06:56.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.264 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '03b13b39-ac4b-4cfe-a18f-c45b732d55d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.266311', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '6271cec4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': 'e36b9dc39b9f60204224a4547e8c9fc77a9d33529995f1298c30c0d3c2cf4458'}]}, 'timestamp': '2026-02-23 10:06:56.266927', '_unique_id': '75056aa0bf694287948606af0364f677'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.267 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.269 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f85a8899-0999-4704-9d27-f07eacc12479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.269676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '627254a2-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '674869070745e6430dbd0e64ff02112e99b02139ec806867659421990ac24f57'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.269676', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '62726712-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.339967575, 'message_signature': '541592fc9cd45f89cf55d56c9368ef07670f1e85bb26e90c9d4ed385d70d4b77'}]}, 'timestamp': '2026-02-23 10:06:56.270730', '_unique_id': 'dd2b0fd37e9346db845376a2820a8d28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.271 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.273 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.273 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.274 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b84f8198-ec3f-4927-a758-64bc91ea2b61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:06:56.273750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6272f42a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': 'e35c264b881158fd5babe1f1146ceddeacf6d0800fb7eaf7e1d151d5da6d485e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:06:56.273750', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6273047e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.429211394, 'message_signature': '6733d3b20ccda55fdc71b22bd97e795b8aac34642afe729bce589fb46c258a92'}]}, 'timestamp': '2026-02-23 10:06:56.274738', '_unique_id': 'c9b82f73097348e7a99208189cb03bcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.275 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.276 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '334fc1f8-3000-4afb-900d-7c3bc56f2190', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.276154', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '62734ad8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '29b19ebc690dccdcf4970af27ccbb041b056d45f637cd794744d29d3b0cfd3e4'}]}, 'timestamp': '2026-02-23 10:06:56.276528', '_unique_id': 'f62748a750c84151b6c4383baa8e70d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.277 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5de34e47-a66b-469f-a41e-ccc0146da6ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:06:56.277916', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': '62738fe8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12656.376344259, 'message_signature': '647b81063d54a4f1ec38daf4a8e5da9ab35b7a0d3f813e3286997c7804f79a63'}]}, 'timestamp': '2026-02-23 10:06:56.278290', '_unique_id': '8ea38ff0f2f943a7b37d4d088636737a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:06:56.278 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:06:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:06:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:06:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:06:56.278 12 ERROR oslo_messaging.notify.messaging Feb 23 05:06:57 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:06:57 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:06:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:06:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e249 do_prune osdmap full prune enabled Feb 23 05:06:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 e250: 6 total, 6 up, 6 in Feb 23 05:06:58 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in Feb 23 05:06:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:06:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:06:59 localhost podman[323394]: 2026-02-23 10:06:59.915270388 +0000 UTC m=+0.081658516 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
org.label-schema.license=GPLv2) Feb 23 05:06:59 localhost podman[323394]: 2026-02-23 10:06:59.931345085 +0000 UTC m=+0.097733183 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2) Feb 23 05:06:59 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:07:00 localhost podman[323393]: 2026-02-23 10:07:00.025565228 +0000 UTC m=+0.195626399 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent) Feb 23 05:07:00 localhost podman[323393]: 2026-02-23 10:07:00.055568005 +0000 UTC m=+0.225629196 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, 
io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Feb 23 05:07:00 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:07:00 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:07:00 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:07:00 localhost nova_compute[282206]: 2026-02-23 10:07:00.422 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:04 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Feb 23 05:07:05 localhost nova_compute[282206]: 2026-02-23 10:07:05.426 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:05 localhost nova_compute[282206]: 2026-02-23 10:07:05.428 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:05 localhost nova_compute[282206]: 2026-02-23 10:07:05.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:05 localhost nova_compute[282206]: 2026-02-23 10:07:05.429 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:05 localhost nova_compute[282206]: 2026-02-23 10:07:05.455 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:05 localhost nova_compute[282206]: 2026-02-23 10:07:05.455 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:07 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:07:07 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:07:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:09 localhost podman[242954]: time="2026-02-23T10:07:09Z" level=info msg="List containers: received `last` parameter - 
overwriting `limit`" Feb 23 05:07:09 localhost podman[242954]: @ - - [23/Feb/2026:10:07:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:07:09 localhost podman[242954]: @ - - [23/Feb/2026:10:07:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18824 "" "Go-http-client/1.1" Feb 23 05:07:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e250 do_prune osdmap full prune enabled Feb 23 05:07:10 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 e251: 6 total, 6 up, 6 in Feb 23 05:07:10 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in Feb 23 05:07:10 localhost nova_compute[282206]: 2026-02-23 10:07:10.456 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:10 localhost nova_compute[282206]: 2026-02-23 10:07:10.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:10 localhost nova_compute[282206]: 2026-02-23 10:07:10.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:10 localhost nova_compute[282206]: 2026-02-23 10:07:10.458 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:10 localhost nova_compute[282206]: 2026-02-23 10:07:10.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:10 localhost nova_compute[282206]: 2026-02-23 10:07:10.501 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:07:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:07:10 localhost systemd[1]: tmp-crun.bPn3mH.mount: Deactivated successfully. Feb 23 05:07:10 localhost podman[323432]: 2026-02-23 10:07:10.920138998 +0000 UTC m=+0.092139559 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 05:07:10 localhost podman[323433]: 2026-02-23 10:07:10.965526881 +0000 UTC m=+0.133095626 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 
'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:07:10 localhost podman[323432]: 2026-02-23 10:07:10.984211778 +0000 UTC m=+0.156212329 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible) Feb 23 05:07:10 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:07:11 localhost podman[323433]: 2026-02-23 10:07:11.000203663 +0000 UTC m=+0.167772398 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:07:11 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 05:07:12 localhost nova_compute[282206]: 2026-02-23 10:07:12.255 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:07:12.256 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:07:12 localhost ovn_metadata_agent[163567]: 2026-02-23 10:07:12.258 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:07:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:13 localhost openstack_network_exporter[245358]: ERROR 10:07:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:07:13 localhost openstack_network_exporter[245358]: Feb 23 05:07:13 localhost openstack_network_exporter[245358]: ERROR 10:07:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:07:13 localhost openstack_network_exporter[245358]: Feb 23 05:07:15 localhost ovn_metadata_agent[163567]: 2026-02-23 10:07:15.260 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, 
table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:07:15 localhost nova_compute[282206]: 2026-02-23 10:07:15.530 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:16 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : mgrmap e51: np0005626465.hlpkwo(active, since 16m), standbys: np0005626463.wtksup, np0005626466.nisqfq Feb 23 05:07:16 localhost sshd[323475]: main: sshd: ssh-rsa algorithm is disabled Feb 23 05:07:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e251 do_prune osdmap full prune enabled Feb 23 05:07:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 e252: 6 total, 6 up, 6 in Feb 23 05:07:18 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in Feb 23 05:07:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:07:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:07:20 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Feb 23 05:07:20 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.15723 172.18.0.34:0/506840426' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Feb 23 05:07:20 localhost nova_compute[282206]: 2026-02-23 
10:07:20.532 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:20 localhost nova_compute[282206]: 2026-02-23 10:07:20.533 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:20 localhost nova_compute[282206]: 2026-02-23 10:07:20.534 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:20 localhost nova_compute[282206]: 2026-02-23 10:07:20.534 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:20 localhost nova_compute[282206]: 2026-02-23 10:07:20.563 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:20 localhost nova_compute[282206]: 2026-02-23 10:07:20.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. 
Feb 23 05:07:24 localhost podman[323477]: 2026-02-23 10:07:24.916143369 +0000 UTC m=+0.089714335 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2) Feb 23 05:07:24 localhost systemd[1]: tmp-crun.CYEM3O.mount: Deactivated successfully. 
Feb 23 05:07:24 localhost podman[323478]: 2026-02-23 10:07:24.983988936 +0000 UTC m=+0.155388795 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:07:25 localhost podman[323478]: 2026-02-23 10:07:25.021275798 +0000 UTC m=+0.192675647 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:07:25 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:07:25 localhost podman[323477]: 2026-02-23 10:07:25.071811241 +0000 UTC m=+0.245382217 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS) Feb 23 05:07:25 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:07:25 localhost nova_compute[282206]: 2026-02-23 10:07:25.564 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:25 localhost nova_compute[282206]: 2026-02-23 10:07:25.566 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:29 localhost nova_compute[282206]: 2026-02-23 10:07:29.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:29 localhost nova_compute[282206]: 2026-02-23 10:07:29.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.202 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.203 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.203 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.203 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.560 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.567 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.570 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.575 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:07:30 localhost nova_compute[282206]: 2026-02-23 10:07:30.576 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:07:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:07:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:07:30 localhost podman[323526]: 2026-02-23 10:07:30.925974749 +0000 UTC m=+0.093655477 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Feb 23 05:07:30 localhost podman[323525]: 2026-02-23 10:07:30.976043197 +0000 UTC m=+0.147381628 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:07:30 localhost podman[323526]: 2026-02-23 10:07:30.990779482 +0000 UTC m=+0.158460220 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.vendor=CentOS) Feb 23 05:07:31 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:07:31 localhost podman[323525]: 2026-02-23 10:07:31.009322316 +0000 UTC m=+0.180660767 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260216, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:07:31 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:07:31 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:07:31 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1412922500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:07:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:34 localhost nova_compute[282206]: 2026-02-23 10:07:34.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:07:35 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:07:35 localhost nova_compute[282206]: 2026-02-23 10:07:35.572 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:35 localhost nova_compute[282206]: 2026-02-23 10:07:35.574 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:35 localhost nova_compute[282206]: 2026-02-23 10:07:35.575 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:35 localhost nova_compute[282206]: 2026-02-23 10:07:35.575 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e252 do_prune osdmap full prune enabled Feb 23 05:07:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e253 e253: 6 total, 6 up, 6 in Feb 23 05:07:35 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in Feb 23 05:07:35 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:07:35 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:07:35 localhost nova_compute[282206]: 2026-02-23 10:07:35.614 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:35 localhost nova_compute[282206]: 2026-02-23 10:07:35.615 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:35 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:07:35 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.075 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.076 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.095 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.095 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.096 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.096 282211 DEBUG nova.compute.resource_tracker [None 
req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.097 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:07:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:07:36 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/6716228' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.589 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:07:36 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.650 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.651 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.813 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.815 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11200MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": 
"7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.815 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.815 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.867 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.868 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.868 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:07:36 localhost nova_compute[282206]: 2026-02-23 10:07:36.907 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:07:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:07:37 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/63466345' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:07:37 localhost nova_compute[282206]: 2026-02-23 10:07:37.382 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:07:37 localhost nova_compute[282206]: 2026-02-23 10:07:37.390 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:07:37 localhost nova_compute[282206]: 2026-02-23 10:07:37.409 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:07:37 localhost nova_compute[282206]: 2026-02-23 10:07:37.412 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:07:37 localhost nova_compute[282206]: 2026-02-23 10:07:37.413 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:07:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e253 do_prune osdmap full prune enabled Feb 23 05:07:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e254 e254: 6 total, 6 up, 6 in Feb 23 05:07:38 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in Feb 23 05:07:38 localhost nova_compute[282206]: 2026-02-23 10:07:38.413 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:38 localhost nova_compute[282206]: 2026-02-23 10:07:38.414 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:39 localhost podman[242954]: time="2026-02-23T10:07:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:07:39 localhost podman[242954]: @ - - [23/Feb/2026:10:07:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:07:39 localhost podman[242954]: @ - - [23/Feb/2026:10:07:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false 
HTTP/1.1" 200 18826 "" "Go-http-client/1.1" Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e254 do_prune osdmap full prune enabled Feb 23 05:07:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 e255: 6 total, 6 up, 6 in Feb 23 05:07:40 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.615 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.638 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.639 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.639 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:40 localhost nova_compute[282206]: 2026-02-23 10:07:40.640 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:07:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:07:41 localhost podman[323693]: 2026-02-23 10:07:41.920390938 +0000 UTC m=+0.086233027 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:07:41 localhost podman[323693]: 2026-02-23 10:07:41.929918452 +0000 UTC m=+0.095760541 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Feb 23 05:07:41 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:07:42 localhost podman[323692]: 2026-02-23 10:07:42.019936525 +0000 UTC m=+0.188410115 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 23 05:07:42 localhost nova_compute[282206]: 2026-02-23 10:07:42.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:07:42 localhost podman[323692]: 
2026-02-23 10:07:42.061324085 +0000 UTC m=+0.229797665 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Feb 23 05:07:42 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:07:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:43 localhost openstack_network_exporter[245358]: ERROR 10:07:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:07:43 localhost openstack_network_exporter[245358]: Feb 23 05:07:43 localhost openstack_network_exporter[245358]: ERROR 10:07:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:07:43 localhost openstack_network_exporter[245358]: Feb 23 05:07:45 localhost nova_compute[282206]: 2026-02-23 10:07:45.641 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:45 localhost nova_compute[282206]: 2026-02-23 10:07:45.643 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:45 localhost nova_compute[282206]: 2026-02-23 10:07:45.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:45 localhost nova_compute[282206]: 2026-02-23 10:07:45.644 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:45 localhost nova_compute[282206]: 2026-02-23 10:07:45.671 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:45 localhost nova_compute[282206]: 2026-02-23 10:07:45.671 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:48 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e255 do_prune osdmap full prune enabled Feb 23 05:07:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e256 e256: 6 total, 6 up, 6 in Feb 23 05:07:48 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in Feb 23 05:07:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:07:48.568 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:07:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:07:48.568 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:07:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:07:48.569 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:07:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e256 do_prune osdmap full prune enabled Feb 23 05:07:49 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e257 e257: 6 total, 6 up, 6 in Feb 23 05:07:49 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in Feb 23 05:07:50 localhost nova_compute[282206]: 2026-02-23 10:07:50.672 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:50 localhost nova_compute[282206]: 2026-02-23 10:07:50.673 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:50 localhost nova_compute[282206]: 2026-02-23 10:07:50.673 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:50 localhost nova_compute[282206]: 2026-02-23 10:07:50.673 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:50 localhost nova_compute[282206]: 2026-02-23 10:07:50.705 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:50 localhost nova_compute[282206]: 2026-02-23 10:07:50.705 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e257 do_prune osdmap full prune enabled Feb 23 05:07:55 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e258 e258: 6 total, 6 up, 6 in Feb 23 05:07:55 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e258: 6 total, 6 up, 6 in Feb 23 05:07:55 localhost nova_compute[282206]: 2026-02-23 10:07:55.707 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:55 localhost nova_compute[282206]: 2026-02-23 10:07:55.708 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:07:55 localhost nova_compute[282206]: 2026-02-23 10:07:55.709 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:07:55 localhost nova_compute[282206]: 2026-02-23 10:07:55.709 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:55 localhost nova_compute[282206]: 2026-02-23 10:07:55.736 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:07:55 localhost nova_compute[282206]: 2026-02-23 10:07:55.737 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:07:55 localhost systemd[1]: tmp-crun.4Bv5rm.mount: Deactivated successfully. 
Feb 23 05:07:55 localhost podman[323736]: 2026-02-23 10:07:55.924584572 +0000 UTC m=+0.097667591 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Feb 23 05:07:55 localhost podman[323737]: 2026-02-23 10:07:55.970428428 +0000 UTC m=+0.139431991 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:07:55 localhost podman[323737]: 2026-02-23 10:07:55.982288595 +0000 UTC m=+0.151292168 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 05:07:55 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 05:07:56 localhost podman[323736]: 2026-02-23 10:07:56.009542238 +0000 UTC m=+0.182625317 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Feb 23 05:07:56 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 05:07:56 localhost systemd[1]: tmp-crun.qP82XC.mount: Deactivated successfully. Feb 23 05:07:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:07:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e258 do_prune osdmap full prune enabled Feb 23 05:07:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e259 e259: 6 total, 6 up, 6 in Feb 23 05:07:58 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in Feb 23 05:08:00 localhost nova_compute[282206]: 2026-02-23 10:08:00.738 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:00 localhost nova_compute[282206]: 2026-02-23 10:08:00.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:00 localhost nova_compute[282206]: 2026-02-23 10:08:00.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:00 localhost nova_compute[282206]: 2026-02-23 10:08:00.740 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:00 localhost nova_compute[282206]: 2026-02-23 10:08:00.741 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:00 localhost nova_compute[282206]: 2026-02-23 10:08:00.744 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:08:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:08:01 localhost podman[323785]: 2026-02-23 10:08:01.908197111 +0000 UTC m=+0.077337152 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216) Feb 23 05:08:01 localhost podman[323785]: 2026-02-23 10:08:01.943448261 +0000 UTC m=+0.112588312 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260216) Feb 23 05:08:01 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:08:02 localhost podman[323786]: 2026-02-23 10:08:02.032150113 +0000 UTC m=+0.198729734 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:08:02 localhost podman[323786]: 2026-02-23 10:08:02.047151607 +0000 UTC m=+0.213731258 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 
'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Feb 23 05:08:02 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 05:08:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e259 do_prune osdmap full prune enabled Feb 23 05:08:05 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e260 e260: 6 total, 6 up, 6 in Feb 23 05:08:05 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in Feb 23 05:08:05 localhost nova_compute[282206]: 2026-02-23 10:08:05.743 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:05 localhost nova_compute[282206]: 2026-02-23 10:08:05.745 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:05 localhost nova_compute[282206]: 2026-02-23 10:08:05.745 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:05 localhost nova_compute[282206]: 2026-02-23 10:08:05.746 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:05 localhost nova_compute[282206]: 2026-02-23 10:08:05.784 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:05 localhost nova_compute[282206]: 2026-02-23 10:08:05.785 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:06 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e260 do_prune osdmap full prune enabled Feb 23 05:08:06 localhost 
ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 e261: 6 total, 6 up, 6 in Feb 23 05:08:06 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in Feb 23 05:08:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:09 localhost podman[242954]: time="2026-02-23T10:08:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:08:09 localhost podman[242954]: @ - - [23/Feb/2026:10:08:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:08:09 localhost podman[242954]: @ - - [23/Feb/2026:10:08:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18831 "" "Go-http-client/1.1" Feb 23 05:08:10 localhost ovn_metadata_agent[163567]: 2026-02-23 10:08:10.691 163572 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '22:68:bc', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:19:65:94:49:af'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Feb 23 05:08:10 localhost ovn_metadata_agent[163567]: 2026-02-23 10:08:10.692 163572 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Feb 23 05:08:10 localhost nova_compute[282206]: 2026-02-23 10:08:10.725 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:10 localhost nova_compute[282206]: 2026-02-23 10:08:10.786 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:10 localhost nova_compute[282206]: 2026-02-23 10:08:10.788 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:08:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:08:12 localhost podman[323822]: 2026-02-23 10:08:12.900795761 +0000 UTC m=+0.076766614 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., release=1770267347, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c) Feb 23 05:08:12 localhost podman[323822]: 2026-02-23 10:08:12.916306041 +0000 UTC m=+0.092276954 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Feb 23 05:08:12 localhost systemd[1]: 
6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:08:12 localhost systemd[1]: tmp-crun.XOmIoc.mount: Deactivated successfully. Feb 23 05:08:12 localhost podman[323823]: 2026-02-23 10:08:12.963195671 +0000 UTC m=+0.136268034 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Feb 23 05:08:12 localhost podman[323823]: 2026-02-23 10:08:12.999352759 +0000 UTC m=+0.172425102 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': 
['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Feb 23 05:08:13 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. Feb 23 05:08:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e261 do_prune osdmap full prune enabled Feb 23 05:08:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 e262: 6 total, 6 up, 6 in Feb 23 05:08:13 localhost ceph-mon[294160]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in Feb 23 05:08:13 localhost openstack_network_exporter[245358]: ERROR 10:08:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:08:13 localhost openstack_network_exporter[245358]: Feb 23 05:08:13 localhost openstack_network_exporter[245358]: ERROR 10:08:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:08:13 localhost openstack_network_exporter[245358]: Feb 23 05:08:15 localhost nova_compute[282206]: 2026-02-23 10:08:15.817 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:15 localhost nova_compute[282206]: 2026-02-23 10:08:15.819 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:15 localhost nova_compute[282206]: 2026-02-23 10:08:15.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:15 localhost nova_compute[282206]: 2026-02-23 10:08:15.820 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:15 localhost nova_compute[282206]: 2026-02-23 10:08:15.821 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:15 localhost nova_compute[282206]: 2026-02-23 10:08:15.825 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:16 localhost ovn_metadata_agent[163567]: 2026-02-23 10:08:16.694 163572 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=96b5bb93-7341-4ce6-9b93-6a5de566c711, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Feb 23 05:08:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Feb 23 05:08:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2040120774' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Feb 23 05:08:17 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Feb 23 05:08:17 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2040120774' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Feb 23 05:08:18 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0. Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.126359) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298126400, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1882, "num_deletes": 258, "total_data_size": 1999982, "memory_usage": 2134784, "flush_reason": "Manual Compaction"} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298136602, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1470944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38816, "largest_seqno": 40697, "table_properties": {"data_size": 1464544, "index_size": 3293, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17637, "raw_average_key_size": 21, "raw_value_size": 1450143, "raw_average_value_size": 1794, "num_data_blocks": 144, 
"num_entries": 808, "num_filter_entries": 808, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841173, "oldest_key_time": 1771841173, "file_creation_time": 1771841298, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 10310 microseconds, and 4737 cpu microseconds. Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.136670) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1470944 bytes OK Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.136694) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.140539) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.140559) EVENT_LOG_v1 {"time_micros": 1771841298140553, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.140583) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 1991782, prev total WAL file size 1992272, number of live WAL files 2. Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.141450) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323533' seq:72057594037927935, type:22 .. 
'6D6772737461740034353035' seq:0, type:0; will stop at (end) Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(1436KB)], [72(17MB)] Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298141493, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 19819172, "oldest_snapshot_seqno": -1} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14548 keys, 18170398 bytes, temperature: kUnknown Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298251013, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 18170398, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18088153, "index_size": 44768, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36421, "raw_key_size": 390711, "raw_average_key_size": 26, "raw_value_size": 17841712, "raw_average_value_size": 1226, "num_data_blocks": 1656, "num_entries": 14548, "num_filter_entries": 14548, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841298, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.251375) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 18170398 bytes Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.257548) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.8 rd, 165.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 17.5 +0.0 blob) out(17.3 +0.0 blob), read-write-amplify(25.8) write-amplify(12.4) OK, records in: 15025, records dropped: 477 output_compression: NoCompression Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.257576) EVENT_LOG_v1 {"time_micros": 1771841298257563, "job": 44, "event": "compaction_finished", "compaction_time_micros": 109618, "compaction_time_cpu_micros": 52698, "output_level": 6, "num_output_files": 1, "total_output_size": 18170398, "num_input_records": 15025, "num_output_records": 14548, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298257999, "job": 44, "event": "table_file_deletion", "file_number": 74} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841298260762, "job": 44, "event": "table_file_deletion", "file_number": 72} Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.141367) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260914) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:18 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:18.260923) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:20 localhost nova_compute[282206]: 2026-02-23 10:08:20.069 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:20 localhost nova_compute[282206]: 2026-02-23 10:08:20.822 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:23 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:23 localhost nova_compute[282206]: 2026-02-23 10:08:23.511 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:25 localhost nova_compute[282206]: 2026-02-23 10:08:25.826 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. Feb 23 05:08:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:08:26 localhost systemd[1]: tmp-crun.sY3QB4.mount: Deactivated successfully. 
Feb 23 05:08:26 localhost podman[323867]: 2026-02-23 10:08:26.922010427 +0000 UTC m=+0.096672280 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260216, tcib_build_tag=8419493e1fd846703d277695e03fc5eb) Feb 23 05:08:26 localhost ovn_controller[157695]: 2026-02-23T10:08:26Z|00377|binding|INFO|Releasing lport 4143c8ea-7577-4792-9744-bcff90eb20f2 from this chassis (sb_readonly=0) Feb 23 05:08:26 localhost podman[323868]: 2026-02-23 10:08:26.991919037 +0000 UTC m=+0.163004050 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Feb 23 05:08:27 localhost nova_compute[282206]: 2026-02-23 10:08:27.006 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:27 localhost podman[323867]: 2026-02-23 10:08:27.022209124 +0000 UTC m=+0.196870957 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.build-date=20260216, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Feb 23 05:08:27 localhost podman[323868]: 2026-02-23 10:08:27.02824258 +0000 UTC m=+0.199327593 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', 
'--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:08:27 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. Feb 23 05:08:27 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. Feb 23 05:08:28 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:30 localhost nova_compute[282206]: 2026-02-23 10:08:30.868 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:31 localhost nova_compute[282206]: 2026-02-23 10:08:31.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:31 localhost nova_compute[282206]: 2026-02-23 10:08:31.054 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.055 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.056 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.482 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.483 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquired lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.483 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.484 282211 DEBUG nova.objects.instance [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c2a7d92b-952f-46a7-8a6a-3322a48fcf4b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Feb 23 05:08:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:08:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. Feb 23 05:08:32 localhost systemd[1]: tmp-crun.oP3OL8.mount: Deactivated successfully. Feb 23 05:08:32 localhost podman[323918]: 2026-02-23 10:08:32.913437047 +0000 UTC m=+0.085208235 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260216) Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.923 282211 DEBUG nova.network.neutron [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updating instance_info_cache with network_info: [{"id": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "address": "fa:16:3e:a0:9d:00", "network": {"id": "9da5b53d-3184-450f-9a5b-bdba1a6c9f6d", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "37b8098efb0d4ecc90b451a2db0e966f", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa27e5011-20", "ovs_interfaceid": "a27e5011-2016-4b16-b5e8-04b555b30bc4", "qbh_params": null, "qbg_params": null, "active": true, 
"vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.944 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Releasing lock "refresh_cache-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Feb 23 05:08:32 localhost nova_compute[282206]: 2026-02-23 10:08:32.945 282211 DEBUG nova.compute.manager [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] [instance: c2a7d92b-952f-46a7-8a6a-3322a48fcf4b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Feb 23 05:08:32 localhost podman[323919]: 2026-02-23 10:08:32.965245829 +0000 UTC m=+0.133766576 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:08:32 localhost podman[323919]: 2026-02-23 10:08:32.979359286 +0000 UTC m=+0.147880003 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS) Feb 23 05:08:32 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. 
Feb 23 05:08:32 localhost podman[323918]: 2026-02-23 10:08:32.995295088 +0000 UTC m=+0.167066286 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, container_name=ovn_metadata_agent) Feb 23 05:08:33 localhost systemd[1]: 
11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:08:33 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:35 localhost nova_compute[282206]: 2026-02-23 10:08:35.870 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:35 localhost nova_compute[282206]: 2026-02-23 10:08:35.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:35 localhost nova_compute[282206]: 2026-02-23 10:08:35.872 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:35 localhost nova_compute[282206]: 2026-02-23 10:08:35.873 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:35 localhost nova_compute[282206]: 2026-02-23 10:08:35.912 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:35 localhost nova_compute[282206]: 2026-02-23 10:08:35.913 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:36 localhost nova_compute[282206]: 2026-02-23 10:08:36.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:36 localhost ceph-mon[294160]: 
log_channel(cluster) log [DBG] : mgrmap e52: np0005626465.hlpkwo(active, since 17m), standbys: np0005626463.wtksup, np0005626466.nisqfq Feb 23 05:08:36 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Feb 23 05:08:36 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.050 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.053 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.077 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.078 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.078 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Auditing locally available compute resources for np0005626463.localdomain (node: np0005626463.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.079 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:08:37 localhost ceph-mon[294160]: from='mgr.27078 172.18.0.107:0/3454997775' entity='mgr.np0005626465.hlpkwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Feb 23 05:08:37 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:08:37 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:08:37 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1622287573' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.559 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.618 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.619 282211 DEBUG nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.887 282211 WARNING nova.virt.libvirt.driver [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.888 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Hypervisor/Node resource view: name=np0005626463.localdomain free_ram=11199MB free_disk=41.8366584777832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": 
"7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.889 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:08:37 localhost nova_compute[282206]: 2026-02-23 10:08:37.889 282211 DEBUG oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:08:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.208 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Instance c2a7d92b-952f-46a7-8a6a-3322a48fcf4b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.209 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.209 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Final resource view: name=np0005626463.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.240 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Feb 23 05:08:38 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Feb 23 05:08:38 localhost ceph-mon[294160]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2678903232' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.704 282211 DEBUG oslo_concurrency.processutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.712 282211 DEBUG nova.compute.provider_tree [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed in ProviderTree for provider: be63d86c-a403-4ec9-a515-07ea2962cb4d update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.733 282211 DEBUG nova.scheduler.client.report [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Inventory has not changed for provider be63d86c-a403-4ec9-a515-07ea2962cb4d based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.736 282211 DEBUG nova.compute.resource_tracker [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Compute_service record updated for np0005626463.localdomain:np0005626463.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Feb 23 05:08:38 localhost nova_compute[282206]: 2026-02-23 10:08:38.736 282211 DEBUG 
oslo_concurrency.lockutils [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:08:39 localhost podman[242954]: time="2026-02-23T10:08:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:08:39 localhost podman[242954]: @ - - [23/Feb/2026:10:08:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:08:39 localhost podman[242954]: @ - - [23/Feb/2026:10:08:39 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18822 "" "Go-http-client/1.1" Feb 23 05:08:39 localhost nova_compute[282206]: 2026-02-23 10:08:39.738 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:40 localhost ceph-mon[294160]: mon.np0005626463@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Feb 23 05:08:40 localhost ceph-mon[294160]: log_channel(audit) log [INF] : from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:08:40 localhost nova_compute[282206]: 2026-02-23 10:08:40.914 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:40 localhost nova_compute[282206]: 2026-02-23 10:08:40.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:40 localhost nova_compute[282206]: 2026-02-23 10:08:40.916 282211 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:40 localhost nova_compute[282206]: 2026-02-23 10:08:40.916 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:40 localhost nova_compute[282206]: 2026-02-23 10:08:40.945 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:40 localhost nova_compute[282206]: 2026-02-23 10:08:40.946 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:41 localhost nova_compute[282206]: 2026-02-23 10:08:41.054 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:41 localhost ceph-mon[294160]: from='mgr.27078 ' entity='mgr.np0005626465.hlpkwo' Feb 23 05:08:42 localhost nova_compute[282206]: 2026-02-23 10:08:42.055 282211 DEBUG oslo_service.periodic_task [None req-89d95007-935e-4c7b-b26e-9452110ab2e7 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Feb 23 05:08:43 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:43 localhost openstack_network_exporter[245358]: ERROR 10:08:43 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:08:43 localhost openstack_network_exporter[245358]: Feb 23 
05:08:43 localhost openstack_network_exporter[245358]: ERROR 10:08:43 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:08:43 localhost openstack_network_exporter[245358]: Feb 23 05:08:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:08:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:08:43 localhost podman[324086]: 2026-02-23 10:08:43.915749227 +0000 UTC m=+0.079969423 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vendor=Red Hat, Inc., managed_by=edpm_ansible, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Feb 23 05:08:43 localhost podman[324086]: 2026-02-23 10:08:43.932272148 +0000 UTC m=+0.096492334 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, release=1770267347, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z) Feb 23 05:08:43 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. Feb 23 05:08:44 localhost podman[324087]: 2026-02-23 10:08:44.018997479 +0000 UTC m=+0.179977035 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:08:44 localhost podman[324087]: 2026-02-23 10:08:44.056182029 +0000 UTC m=+0.217161485 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 05:08:44 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 05:08:45 localhost nova_compute[282206]: 2026-02-23 10:08:45.947 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:45 localhost nova_compute[282206]: 2026-02-23 10:08:45.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:45 localhost nova_compute[282206]: 2026-02-23 10:08:45.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:45 localhost nova_compute[282206]: 2026-02-23 10:08:45.949 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:45 localhost nova_compute[282206]: 2026-02-23 10:08:45.984 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:45 localhost nova_compute[282206]: 2026-02-23 10:08:45.984 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:48 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:08:48.568 163572 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Feb 23 05:08:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:08:48.569 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" 
acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Feb 23 05:08:48 localhost ovn_metadata_agent[163567]: 2026-02-23 10:08:48.570 163572 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Feb 23 05:08:50 localhost nova_compute[282206]: 2026-02-23 10:08:50.986 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:50 localhost nova_compute[282206]: 2026-02-23 10:08:50.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:50 localhost nova_compute[282206]: 2026-02-23 10:08:50.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:08:50 localhost nova_compute[282206]: 2026-02-23 10:08:50.988 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:51 localhost nova_compute[282206]: 2026-02-23 10:08:51.014 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:51 localhost nova_compute[282206]: 2026-02-23 10:08:51.015 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:08:53 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 
343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:56 localhost nova_compute[282206]: 2026-02-23 10:08:56.016 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:08:56 localhost nova_compute[282206]: 2026-02-23 10:08:56.018 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.152 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'name': 'test', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'np0005626463.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '37b8098efb0d4ecc90b451a2db0e966f', 'user_id': 'cb6895487918456aa599ca2f76872d00', 'hostId': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.153 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.173 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/memory.usage volume: 51.72265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8009e87f-a614-45b8-9874-f65c73dda896', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.72265625, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:08:56.154104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a9ea2530-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.362539547, 'message_signature': '63a9e5969553b9963b4750d37752251eb8a57dd7959e011d50403b12ef79498f'}]}, 'timestamp': '2026-02-23 10:08:56.173791', '_unique_id': 'cac2a529356e4d27b7159af96e48050f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.175 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.187 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa3ef6fe-79d9-4779-bc0a-bec864c64fac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.176777', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9ec4ab8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '4f7e10bcedbb4f441b488605291f9f69ff467e53490bb71cb7911a8707890b0d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.176777', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9ec5d8c-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '405e5d8affb23fdfc6683f704cf5b1f756fdc133f8c4b8b52122781e87b26c8c'}]}, 'timestamp': '2026-02-23 10:08:56.188251', '_unique_id': '626b37d1370048d89cd04efbdc44f6b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.189 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.193 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4021d904-30f1-40af-9ef0-d84b9c1615e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.190463', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9ed51b0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '850d38692b39139cff8b20d25d9f6c9d58f890e7f269d02c5b678d1d899d1eb8'}]}, 'timestamp': '2026-02-23 10:08:56.194541', '_unique_id': '1c266a4fe67844abb1720a1c0d4a2e02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.195 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.227 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 1054797520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.228 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.latency volume: 21338362 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca80bf58-e775-4564-9550-374a1a1ed391', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1054797520, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.196712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f26b50-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'cc69b18a833af66eb028514277d2fed2d70b2254adbf1968e47a94071ca3471e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21338362, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.196712', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f27f0a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '8ccf9a6da0926b4f9cca130d564367600a154e343db6f3d2d44772c1feb3cd5d'}]}, 'timestamp': '2026-02-23 10:08:56.228431', '_unique_id': '962e7c11dd3c4510a9eee7d0cbeab2db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.229 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.231 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1926fbbf-5d1e-4242-a02c-e5d432048fc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.231171', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f2fcc8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'a426efd7c2334118de754c322afd8637318960a61919b6f435a36f33e0be1864'}]}, 'timestamp': '2026-02-23 10:08:56.231689', '_unique_id': 'a59248e0a5054bc3baf85dcde6af5a33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.232 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.233 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 35597312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.234 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7a7fde6-5f7a-4cbe-9b9c-ba2fd8307afc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35597312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.233833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f364b0-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'a705188eba0a6a103b0fe2446a02cd130ede5377304bc809d786dd2ca6e385a7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.233833', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f374b4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'ab2b2e505972d1274fc138f950eaffdec717bf3cb51ebedd2095a1793c62dcbd'}]}, 'timestamp': '2026-02-23 10:08:56.234712', '_unique_id': 'ca0803c1ad2f4bb7a6645c50cdae27bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.235 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.237 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2f33bdc4-b4bc-4b92-b3c9-dc5c5746cb20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.237200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f3e688-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '7f15ad1934af7fa3359d061a0d15ac2029a1a934392a6ac2065bf12d71122914'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.237200', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 
'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f3f65a-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '001fb09de9f34673f98bb36c555c64f132b7e86c086616cfb44f083c5015f12e'}]}, 'timestamp': '2026-02-23 10:08:56.238066', '_unique_id': 'ee9f3d147cec4583848c256daef701fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:08:56.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.238 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.240 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0adc0112-2245-4f44-ba97-a635f859c225', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.240214', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f45c08-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '8dd3930324257d99bf7458cae1264116f1f3a16660be3b26ada51f2c7deb5d2b'}]}, 'timestamp': '2026-02-23 10:08:56.240665', '_unique_id': '5e6082886e194950a5f1562ad5603176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.241 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.242 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 1374424344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.243 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.latency volume: 89322858 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaee693a-928e-40bf-aa25-642005ebf13c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1374424344, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.242905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f4c580-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '68ad7f26fe493987044aba8043223a4d5f4fba6a3d17e1d6c45667831382b846'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 89322858, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.242905', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f4d6c4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'd1f4716fd2c547f6b258048db36b77162643f627c31959cb3c92fbe68df742be'}]}, 'timestamp': '2026-02-23 10:08:56.243779', '_unique_id': '2c9a4d0a712b47d28e1d67fe518795cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.244 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.245 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.246 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '581be7b9-4c1c-4db1-9932-61fbf6f84d22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.245944', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f53c22-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'b067b447287271ddaeeb0b44b93ad87fcd4b0883c68b3d949fc713e5b2685c95'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.245944', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f54bfe-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '7fc7b8a1a08467f9f34d60ec61f5ab456d47738e9c700d598abc55fd741d18d4'}]}, 'timestamp': '2026-02-23 10:08:56.246780', '_unique_id': 'f9b9a84ae8704229b3795e33bc08d914'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.247 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.249 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eb3ff36c-bdf0-4725-a607-3c10b92d35cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.249106', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f5b7c4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'a094e24746ed9b037dc41147886c81841c14e8d83b7209e265d4524713be18e0'}]}, 'timestamp': '2026-02-23 10:08:56.249568', '_unique_id': '2fa164e015754bd487ef2f42fc0d60ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.250 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.251 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.252 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd2c4313-e3d1-4520-8f7c-7aa0f51a26ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.251646', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f61bba-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': 'bc582ec99d6dd56a97d24c7ecab339602976ea113e8f6ce5b366c8615136d152'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.251646', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f62bb4-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.366260761, 'message_signature': '93fc29ba3004824282dfbc2ba13ed420bc5ec6362188aaae45c43a2eab10c8ee'}]}, 'timestamp': '2026-02-23 10:08:56.252505', '_unique_id': 'daf4f9114c344034a87c48834f434445'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.253 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.254 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd0444b0-c5e1-40dc-8812-ddf558440aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.254741', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f694c8-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '61b6c0dbd547b51e28566400e9d920174e2d1926ed1a9fd563f457c26e42cc73'}]}, 'timestamp': '2026-02-23 10:08:56.255225', '_unique_id': '18e0c76b132644cb893e3b0af0d09022'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.256 12 ERROR oslo_messaging.notify.messaging
Feb 23 05:08:56 localhost
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.257 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae859397-b11a-4ede-b73e-fd67edf9b1d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.257290', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f6f756-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'db128ceecb296bd80da8d0a351f39b810e2651255ff80e2160239f114a4a2647'}]}, 'timestamp': '2026-02-23 10:08:56.257748', '_unique_id': '06b23ba9230f4f1e94919d4505220a64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.259 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '3deb007b-77fb-493b-8a5c-d105d221c750', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.259804', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f75afc-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '6ce56ce8d968108711e3d6d5f5c4fd0103003ba684267de6b16e0cf075afaf98'}]}, 'timestamp': '2026-02-23 10:08:56.260300', '_unique_id': 'a6db04f669d044fca298bace61e1e362'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.261 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.262 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f02ec09-d1cf-4884-9874-d6ca8788779f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.262353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f7bd44-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'fa286d883b9791a2c32133bcc19de00e3192d942f0c3955fe6c2dbdaa5b15198'}]}, 'timestamp': '2026-02-23 10:08:56.262815', '_unique_id': '82ddf7ad7a8145ca9101abfd714262f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 
05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.263 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.264 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.264 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/cpu volume: 17210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e98e97df-f251-4eb0-b847-45f836090e14', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17210000000, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'timestamp': '2026-02-23T10:08:56.264612', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a9f81190-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.362539547, 'message_signature': 'e44701d7f3e779cfa1c69218679ebeced872c7826ff5b9701c1167493d26334d'}]}, 'timestamp': '2026-02-23 10:08:56.264899', '_unique_id': 'a003eaa6596d4c8d9a46257ba4eaebe8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 
ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 
05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.265 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.266 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Feb 23 05:08:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 1283 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.266 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ec4ca58-4f04-4911-bedc-0f254cfdbe1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1283, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.266153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f84dd6-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': 'cbf2b37b19f54b911d9a4c105a55bb95870b2f80395d85a00f3053a4e8ee0379'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.266153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f857c2-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '820fbfc60ccaeb2c743f5540f95ad6dcd902166a043864e68207306ebec6c517'}]}, 'timestamp': '2026-02-23 10:08:56.266666', '_unique_id': '4ad358c149e04ddeb87045cc8cb21b72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.267 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.268 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.268 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.incoming.bytes volume: 6808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f4ea06c-d4d4-419c-8191-e4ae87e06e69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6808, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.268353', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tapa27e5011-20'}, 'message_id': 'a9f8a696-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': '7b1a3213ec939d38b37bdc77ab407140511411b55a5991012d93c45d3b7c694c'}]}, 'timestamp': '2026-02-23 10:08:56.268755', '_unique_id': 'd9ad75be96764fedb1eb427f039f071e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 
10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.269 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.270 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.270 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 397312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.271 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ee743eab-b899-460b-af4a-f8052c363285', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 397312, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vda', 'timestamp': '2026-02-23T10:08:56.270714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a9f90384-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '1046cea73af8b9393d9e24a4c772f042a87a4ceb4f7aef1128d0dda0df632f78'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-vdb', 'timestamp': '2026-02-23T10:08:56.270714', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000003', 'instance_id': 
'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a9f9132e-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.386197558, 'message_signature': '636a0baf906ccfed931399bde931cb3a83c2f4691d9fc9bf2f52c646fbcbfe04'}]}, 'timestamp': '2026-02-23 10:08:56.271527', '_unique_id': '28dfc6db346e446180a6509765b355a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 
05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.272 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.273 12 DEBUG ceilometer.compute.pollsters [-] c2a7d92b-952f-46a7-8a6a-3322a48fcf4b/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a029a6ad-7b1f-4419-a511-6a82b1003e5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': 'cb6895487918456aa599ca2f76872d00', 'user_name': None, 'project_id': '37b8098efb0d4ecc90b451a2db0e966f', 'project_name': None, 'resource_id': 'instance-00000003-c2a7d92b-952f-46a7-8a6a-3322a48fcf4b-tapa27e5011-20', 'timestamp': '2026-02-23T10:08:56.273613', 'resource_metadata': {'display_name': 'test', 'name': 'tapa27e5011-20', 'instance_id': 'c2a7d92b-952f-46a7-8a6a-3322a48fcf4b', 'instance_type': 'm1.small', 'host': 'a569597c94111a4c9797a36dd86c712480ca3463ae5de21de1d9b3db', 'instance_host': 'np0005626463.localdomain', 'flavor': {'id': 'c13b1f72-534e-4f1d-8659-0e8f3a2c7d53', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f'}, 'image_ref': 'a9204248-210d-45b5-ab0a-d1ec08a73a4f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:a0:9d:00', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapa27e5011-20'}, 'message_id': 'a9f973fa-109f-11f1-b3f4-fa163e9ab6c6', 'monotonic_time': 12776.379948035, 'message_signature': 'c100fc092a8aff870ce958b3beabc8519ac16e08ae12803842920ec9953b0b33'}]}, 'timestamp': '2026-02-23 10:08:56.274052', '_unique_id': '5056f4334d134e7389e08dff56efc6fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 
2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging yield Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Feb 23 05:08:56 
localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging conn.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Feb 23 05:08:56 localhost 
ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Feb 23 05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Feb 23 
05:08:56 localhost ceilometer_agent_compute[238244]: 2026-02-23 10:08:56.274 12 ERROR oslo_messaging.notify.messaging Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0. Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.548283) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336548339, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 660, "num_deletes": 251, "total_data_size": 713916, "memory_usage": 726328, "flush_reason": "Manual Compaction"} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336556342, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 704350, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40698, "largest_seqno": 41357, "table_properties": {"data_size": 701154, "index_size": 1115, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8123, "raw_average_key_size": 20, "raw_value_size": 694445, "raw_average_value_size": 1718, "num_data_blocks": 50, "num_entries": 404, "num_filter_entries": 404, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", 
"column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771841298, "oldest_key_time": 1771841298, "file_creation_time": 1771841336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 8100 microseconds, and 3160 cpu microseconds. Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.556382) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 704350 bytes OK Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.556405) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.558616) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.558667) EVENT_LOG_v1 {"time_micros": 1771841336558660, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.558693) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 710401, prev total WAL file size 710401, number of live WAL files 2. Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.559381) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. 
'7061786F73003133333034' seq:0, type:0; will stop at (end) Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(687KB)], [75(17MB)] Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336559427, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 18874748, "oldest_snapshot_seqno": -1} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 14430 keys, 17438139 bytes, temperature: kUnknown Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336674049, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 17438139, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17358252, "index_size": 42730, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36101, "raw_key_size": 388746, "raw_average_key_size": 26, "raw_value_size": 17115375, "raw_average_value_size": 1186, "num_data_blocks": 1566, "num_entries": 14430, "num_filter_entries": 14430, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771839971, "oldest_key_time": 0, "file_creation_time": 1771841336, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "4cfd6c8f-aafa-4003-b2f6-d22c49635dd4", "db_session_id": "66DAQ76CBLV8DSGL8JC7", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.674386) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 17438139 bytes Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.676359) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.6 rd, 152.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 17.3 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(51.6) write-amplify(24.8) OK, records in: 14952, records dropped: 522 output_compression: NoCompression Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.676388) EVENT_LOG_v1 {"time_micros": 1771841336676374, "job": 46, "event": "compaction_finished", "compaction_time_micros": 114692, "compaction_time_cpu_micros": 47131, "output_level": 6, "num_output_files": 1, "total_output_size": 17438139, "num_input_records": 14952, "num_output_records": 14430, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005626463/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336676622, "job": 46, "event": "table_file_deletion", "file_number": 77} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005626463/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771841336679045, "job": 46, "event": "table_file_deletion", "file_number": 75} Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.559294) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679113) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679119) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679126) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:56 localhost ceph-mon[294160]: rocksdb: (Original Log Time 2026/02/23-10:08:56.679129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Feb 23 05:08:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc. 
Feb 23 05:08:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d. Feb 23 05:08:57 localhost podman[324129]: 2026-02-23 10:08:57.908810794 +0000 UTC m=+0.079836029 container health_status 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260216, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, managed_by=edpm_ansible) Feb 23 05:08:57 localhost podman[324130]: 2026-02-23 10:08:57.962086481 +0000 UTC m=+0.130540587 container health_status bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Feb 23 05:08:57 localhost podman[324130]: 2026-02-23 10:08:57.974465113 +0000 UTC m=+0.142919219 container exec_died bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', 
'--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Feb 23 05:08:57 localhost systemd[1]: bee8ba29aeb0c3b6a916d1366f5aac446e01450955724b0f85ae8ab1d4d64b3d.service: Deactivated successfully. 
Feb 23 05:08:58 localhost podman[324129]: 2026-02-23 10:08:58.025047527 +0000 UTC m=+0.196072802 container exec_died 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260216, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Feb 23 05:08:58 localhost systemd[1]: 83e0dfad8e11fc0edee47d0ecab9337f343a62cd8dd545e1b2fa3b528410a3fc.service: Deactivated successfully. 
Feb 23 05:08:58 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:08:59 localhost sshd[324176]: main: sshd: ssh-rsa algorithm is disabled Feb 23 05:08:59 localhost systemd-logind[759]: New session 72 of user zuul. Feb 23 05:08:59 localhost systemd[1]: Started Session 72 of User zuul. Feb 23 05:08:59 localhost ovn_controller[157695]: 2026-02-23T10:08:59Z|00378|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Feb 23 05:08:59 localhost python3[324198]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163ef9-e89a-f788-b23e-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Feb 23 05:09:01 localhost nova_compute[282206]: 2026-02-23 10:09:01.019 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:01 localhost nova_compute[282206]: 2026-02-23 10:09:01.021 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:01 localhost nova_compute[282206]: 2026-02-23 10:09:01.022 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:09:01 localhost nova_compute[282206]: 2026-02-23 10:09:01.022 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:01 localhost nova_compute[282206]: 2026-02-23 10:09:01.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:09:01 localhost nova_compute[282206]: 2026-02-23 10:09:01.053 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Feb 23 05:09:01 localhost ceph-osd[31633]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 21K writes, 79K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s#012Cumulative WAL: 21K writes, 7586 syncs, 2.88 writes per sync, written: 0.07 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 36.14 MB, 0.06 MB/s#012Interval WAL: 11K writes, 4718 syncs, 2.46 writes per sync, written: 0.04 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 05:09:03 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:09:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739. Feb 23 05:09:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c. 
Feb 23 05:09:03 localhost podman[324202]: 2026-02-23 10:09:03.922749651 +0000 UTC m=+0.092438979 container health_status be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260216, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, 
container_name=ceilometer_agent_compute) Feb 23 05:09:03 localhost podman[324202]: 2026-02-23 10:09:03.934359181 +0000 UTC m=+0.104048529 container exec_died be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260216, 
config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2) Feb 23 05:09:03 localhost systemd[1]: be69a98bc8c30737a588f2504e3463b9254e63c980be4e376d2f104225355f5c.service: Deactivated successfully. Feb 23 05:09:04 localhost podman[324201]: 2026-02-23 10:09:04.029403939 +0000 UTC m=+0.199056865 container health_status 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 05:09:04 localhost podman[324201]: 2026-02-23 10:09:04.038279783 +0000 UTC m=+0.207932719 container exec_died 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260216, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'a07a4436e1d1ca1c6231f11309616a78b1ad9830450b5c2d2fc3fb113cfbf838-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Feb 23 05:09:04 localhost systemd[1]: 11c0efd0e0b3e96f28e1667bfcc14ff7c3c7ffd1c98bac3f3df14178958e3739.service: Deactivated successfully. Feb 23 05:09:05 localhost systemd[1]: session-72.scope: Deactivated successfully. Feb 23 05:09:05 localhost systemd-logind[759]: Session 72 logged out. Waiting for processes to exit. Feb 23 05:09:05 localhost systemd-logind[759]: Removed session 72. Feb 23 05:09:06 localhost nova_compute[282206]: 2026-02-23 10:09:06.054 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:06 localhost nova_compute[282206]: 2026-02-23 10:09:06.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:06 localhost nova_compute[282206]: 2026-02-23 10:09:06.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:09:06 localhost nova_compute[282206]: 2026-02-23 10:09:06.056 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:06 localhost nova_compute[282206]: 2026-02-23 10:09:06.073 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:09:06 localhost nova_compute[282206]: 2026-02-23 10:09:06.074 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1109] 
------- DUMPING STATS ------- Feb 23 05:09:06 localhost ceph-osd[32575]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 21K writes, 79K keys, 21K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s#012Cumulative WAL: 21K writes, 7360 syncs, 2.95 writes per sync, written: 0.07 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 33.66 MB, 0.06 MB/s#012Interval WAL: 11K writes, 4546 syncs, 2.50 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Feb 23 05:09:08 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:09:09 localhost podman[242954]: time="2026-02-23T10:09:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Feb 23 05:09:09 localhost podman[242954]: @ - - [23/Feb/2026:10:09:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1" Feb 23 05:09:09 localhost podman[242954]: @ - - [23/Feb/2026:10:09:09 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18833 "" "Go-http-client/1.1" Feb 23 05:09:11 localhost nova_compute[282206]: 2026-02-23 10:09:11.075 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:11 localhost nova_compute[282206]: 2026-02-23 10:09:11.103 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:11 localhost nova_compute[282206]: 2026-02-23 10:09:11.104 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:09:11 localhost nova_compute[282206]: 2026-02-23 10:09:11.104 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:11 localhost nova_compute[282206]: 2026-02-23 10:09:11.105 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:09:11 localhost nova_compute[282206]: 2026-02-23 10:09:11.106 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:13 localhost ceph-mon[294160]: mon.np0005626463@0(leader).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Feb 23 05:09:13 localhost openstack_network_exporter[245358]: ERROR 10:09:13 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Feb 23 05:09:13 localhost openstack_network_exporter[245358]: Feb 23 05:09:13 localhost openstack_network_exporter[245358]: ERROR 10:09:13 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Feb 23 05:09:13 localhost openstack_network_exporter[245358]: Feb 23 05:09:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43. Feb 23 05:09:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb. Feb 23 05:09:14 localhost systemd[1]: tmp-crun.HCEgzB.mount: Deactivated successfully. 
Feb 23 05:09:14 localhost podman[324239]: 2026-02-23 10:09:14.921284012 +0000 UTC m=+0.090092417 container health_status da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Feb 23 05:09:14 localhost podman[324239]: 2026-02-23 10:09:14.955635333 +0000 UTC m=+0.124443728 container exec_died da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Feb 23 05:09:14 localhost podman[324238]: 2026-02-23 10:09:14.970697759 +0000 UTC m=+0.142947410 container health_status 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9) Feb 23 05:09:14 localhost systemd[1]: da3017db5cf229fdef3f2bfb4085c381427336b5c58ec1f7e94d11cfbeb2c8eb.service: Deactivated successfully. 
Feb 23 05:09:14 localhost podman[324238]: 2026-02-23 10:09:14.989394247 +0000 UTC m=+0.161643938 container exec_died 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=9.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '69b16e35d788e45bd4dc8fdbb5a12a82a2f32a41425743b954eabdb608a706c0'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-type=git) Feb 23 05:09:15 localhost systemd[1]: 6cb5576e612c42c1aa5a9a12d920a8ac121c3bbe97b17ae6c4c7aa9081ec5b43.service: Deactivated successfully. 
Feb 23 05:09:16 localhost nova_compute[282206]: 2026-02-23 10:09:16.107 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:16 localhost nova_compute[282206]: 2026-02-23 10:09:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Feb 23 05:09:16 localhost nova_compute[282206]: 2026-02-23 10:09:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Feb 23 05:09:16 localhost nova_compute[282206]: 2026-02-23 10:09:16.109 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:16 localhost nova_compute[282206]: 2026-02-23 10:09:16.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Feb 23 05:09:16 localhost nova_compute[282206]: 2026-02-23 10:09:16.141 282211 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Feb 23 05:09:17 localhost sshd[324282]: main: sshd: ssh-rsa algorithm is disabled Feb 23 05:09:17 localhost systemd-logind[759]: New session 73 of user zuul. Feb 23 05:09:17 localhost systemd[1]: Started Session 73 of User zuul.